Drobo Updates 5D to Turbo Edition 5Dt

Subject: Storage | June 21, 2016 - 06:43 PM |
Tagged: usb 3.0, Thunderbolt 2, raid, hdd, drobo, DAS, BeyondRAID, 5Dt, 5D

Today Drobo updated their 5D, moving to Thunderbolt 2 and adding an included mSATA caching SSD and faster internals:

5bay-front-header.png

The new 5Dt (t for Turbo Edition) builds on the strengths of the 5D, which launched three years ago. The distinguishing features remain the same, as this is still a 5-bay model with USB 3.0, but the processor has been upgraded, as well as the USB 3.0 chipset, which was a bit finicky with some earlier implementations of the technology.

rear.png

The changes present themselves at the rear, where we now have a pair of Thunderbolt 2 (20 Gb/s) ports which support display pass-through (up to 4K). Rated speeds climb to 540 MB/s read and 250 MB/s write when using HDDs. SSDs bump those figures up to 545 / 285 MB/s, respectively.

bottom.jpg

Another feature that carries over is their Hot Data Cache technology, but while the mSATA SSD was optional on the 5D, a 128GB unit comes standard and pre-installed on the 5Dt.

The Drobo 5Dt is available today starting at $899. That is a premium over the 5D, but it buys the increased performance, the included caching SSD, and Thunderbolt 2 connectivity.

lineup.png

The current (updated) Drobo product lineup.

Full press blast after the break.


Looking to build your first PC this summer? We have a guide for you!

Subject: General Tech | June 21, 2016 - 06:32 PM |
Tagged:

We don't usually do this, but I have been getting a lot of emails and messages on social media from gamers looking to build their first PC this summer. With the release of the high end GeForce GTX 1080 and GTX 1070 cards this month, and the pending release of the Radeon RX 480 for more budget-minded gamers, there will likely never be a better time to get into PC gaming than now!

Back in February my nephew wanted to build his own gaming PC for the first time. I took that opportunity to put together an article and a three-part video series for enthusiasts and DIYers that were either new to the game or needed a refresher on how to put screws to PCB, so to speak. With the numerous emails and messages I've been getting, I thought now would be a great time to bump the story back up and show everyone how easy it can be to build your own PC, whether it be for gaming, VR, productivity or anything else.

gbsponsor.jpg

You can find the original story right here, sponsored by Gigabyte, but I have also re-embedded the videos below. Yes, the component selections we used in February could use some updating on the graphics card, monitor and maybe power supply, but the rest of the build summary is spot on and the build process remains unchanged.

Good luck to all the budding enthusiasts out there!

Allyn isn't the only one with a Lapdog on his couch

Subject: General Tech | June 20, 2016 - 09:40 PM |
Tagged: RGB, mouse, lapdog, keyboard, gaming control center, couchmaster, Couch, corsair

The Tech Report would like to back Al up in saying that gaming on a TV from the comfort of your couch is not as weird as some would think.  In their case it was Star Wars Battlefront and Civilization V which were tested out: Battlefront because it is a console game often played on a TV, and Civ5 because it is not a twitch game and the extra screen real estate is useful.  They also like the device, although they might prefer a smaller version so that keyboards without a numpad did not leave as much empty room ... perhaps a PocketDog?  Check out their quick review if Al's review almost sold you on the idea.

comparison2.png

"Corsair's Lapdog keyboard tray is built to bridge the gap between the desk and the den by giving gamers a way to put a keyboard and mouse right on their laps. We invited the Lapdog into our living room to see whether it's a good boy."

Here is some more Tech News from around the web:

Tech Talk

Windows 10 versus Ubuntu 16.04 versus NVIDIA versus AMD

Subject: Graphics Cards | June 20, 2016 - 08:11 PM |
Tagged: windows 10, ubuntu, R9 Fury, nvidia, linux, GTX1070, amd

Phoronix wanted to test how the new GTX 1070 and the R9 Fury compare on Ubuntu with new drivers and patches, as well as how they perform on Windows 10.  There are two separate articles, as the focus is not old silicon versus new but rather the performance comparison between the two operating systems.  AMD was tested with the Crimson Edition 16.6.1 driver, the AMDGPU-PRO Beta 2 (16.20.3) driver, as well as Mesa 12.1-dev.  There were interesting differences between the tested games, as some would only support one of the two Linux drivers.  Performance also varies with the game engine, with some results coming out as ties, others seeing Windows 10 pull ahead, and even some cases where performance on Linux was significantly better.

NVIDIA's GTX 1080 and 1070 were tested using the 368.39 driver release for Windows and the 367.27 driver for Ubuntu.  Again we see mixed results; depending on the game, Linux performance might actually beat out Windows, especially if OpenGL is an option.

Check out both reviews to see what performance you can expect from your GPU when gaming under Linux.

image.php_.jpg

"Yesterday I published some Windows 10 vs. Ubuntu 16.04 Linux gaming benchmarks using the GeForce GTX 1070 and GTX 1080 graphics cards. Those numbers were interesting with the NVIDIA proprietary driver but for benchmarking this weekend are Windows 10 results with Radeon Software compared to Ubuntu 16.04 running the new AMDGPU-PRO hybrid driver as well as the latest Git code for a pure open-source driver stack."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Phoronix

NVIDIA Announces PCIe Versions of Tesla P100

Subject: Graphics Cards | June 20, 2016 - 05:57 PM |
Tagged: tesla, pascal, nvidia, GP100

GP100, the “Big Pascal” chip that was announced at GTC, will be coming to PCIe for enterprise and supercomputer customers in Q4 2016. Previously, it had only been announced in the form that uses NVIDIA's proprietary NVLink connection. In fact, they also gave themselves some lead time with their first-party DGX-1 system, which retails for $129,000 USD, although we expect that was more for yield reasons. Josh calculated that each GPU in that system is worth more than the full wafer that its die was manufactured on.

nvidia-2016-gp100tesla.jpg

This brings us to the PCIe versions. Interestingly, they have been down-binned from the NVLink version. The boost clock has been dropped to 1300 MHz, from 1480 MHz, although that is matched with a slightly lower TDP (250W versus the NVLink's 300W). This lowers the FP16 performance to 18.7 TFLOPs, down from 21.2, FP32 performance to 9.3 TFLOPs, down from 10.6, and FP64 performance to 4.7 TFLOPs, down from 5.3. This is where we get to the question: did NVIDIA reduce the clocks to hit a 250W TDP and be compatible with the passive cooling technology that previous Tesla cards utilize, or were the clocks dropped to increase yield?
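For anyone wanting to check the math, peak throughput on these parts is just the CUDA core count times two FLOPs per clock (one fused multiply-add) times the clock, with FP64 at half rate and FP16 at double rate on GP100. A rough back-of-envelope sketch (our own, assuming the published figure of 3584 FP32 cores) lands within rounding of the numbers above:

```python
# Back-of-envelope peak-TFLOPS check for Tesla P100 (GP100).
# Assumes the published 3584 FP32 CUDA cores; an FMA counts as 2 FLOPs per clock,
# with FP64 at 1/2 rate and FP16 at 2x rate on this chip.

def peak_tflops(fp32_cores, clock_ghz, rate_ratio=1.0):
    return fp32_cores * 2 * clock_ghz * rate_ratio / 1000.0

CORES = 3584
for name, boost_ghz in (("NVLink (1480 MHz)", 1.480), ("PCIe (1300 MHz)", 1.300)):
    print(name,
          f"FP16 {peak_tflops(CORES, boost_ghz, 2.0):.1f} /",
          f"FP32 {peak_tflops(CORES, boost_ghz, 1.0):.1f} /",
          f"FP64 {peak_tflops(CORES, boost_ghz, 0.5):.1f} TFLOPS")
# NVLink: 21.2 / 10.6 / 5.3 TFLOPS; PCIe: ~18.6 / 9.3 / 4.7 TFLOPS
```

(The official 18.7 TFLOPS FP16 figure suggests the PCIe boost clock is actually a hair above 1300 MHz, but it is close enough for a sanity check.)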

They are also providing a 12GB version of the PCIe Tesla P100. I didn't realize that GPU vendors could selectively disable HBM2 stacks, but NVIDIA disabled 4GB of memory, which also dropped the bus width to 3072-bit. You would think that, for simplicity, the circuit would want to divide work in a power-of-two fashion, but, knowing that they can, it makes me wonder why they did. Again, my first reaction is to question GP100 yield, but the HBM interface is such a small part of the die that you wouldn't think disabling a chunk of it would reclaim many chips, right? That is, unless the HBM2 stacks themselves have yield issues -- which would be interesting.
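The stack arithmetic itself is simple enough to sketch out. Assuming the 16GB card's standard configuration of four 4GB HBM2 stacks, each on its own 1024-bit interface, pulling one stack gives exactly the 12GB and 3072-bit figures:

```python
# HBM2 stack arithmetic for the 12GB PCIe P100 (a sketch; assumes the 16GB
# card's four 4GB stacks, each with a 1024-bit interface).
STACK_GB, STACK_BITS = 4, 1024

for stacks in (4, 3):  # full part vs. one stack disabled
    print(f"{stacks} stacks: {stacks * STACK_GB}GB, {stacks * STACK_BITS}-bit bus")
# 4 stacks: 16GB, 4096-bit bus
# 3 stacks: 12GB, 3072-bit bus
```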

There is also still no word on a 32GB version. Samsung claimed the memory technology, 8GB stacks of HBM2, would be ready for products in Q4 2016 or early 2017. We'll need to wait and see where, when, and why it will appear.

Source: NVIDIA

If you bought directly from Acer over the past year, double check your spam and email

Subject: General Tech | June 20, 2016 - 05:21 PM |
Tagged: acer, security

North American customers of Acer who bought directly from them between May 12, 2015 and April 28, 2016 may have had their credit card numbers compromised.  Their less-than-secure customer database contained customer names, addresses, card numbers, and three-digit security verification codes, all of which have been siphoned off at least once.  If this breach affected your account, Acer will be sending you a notification; you can see an example at The Register if you want to be sure you are receiving a valid one.  For those who have already seen fraudulent charges this will come too late to mitigate the pain, but anyone who used Acer's online shop during that time period would do well to have their cards changed.

Acer_logo_new.jpg

"Acer's insecure customer database spilled people's personal information – including full payment card numbers – into hackers' hands for more than a year."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

NVIDIA Releases 368.51 Hotfix Driver

Subject: Graphics Cards | June 19, 2016 - 02:37 AM |
Tagged: nvidia, graphics drivers

GeForce Hotfix 368.51 drivers have been released by NVIDIA through their support website. This version only officially addresses flickering at high refresh rates, although its number has been incremented quite a bit since the last official release (368.39), so it's possible that it rolls in other changes, too. That said, I haven't heard of too many specific issues with 368.39, so I'm not quite sure what those changes would be.

nvidia-2015-bandaid.png

As always with a hotfix driver, NVIDIA pushed it out with minimal testing. It should pretty much only be installed if you have a specific issue (particularly the listed one(s)) and you don't want to wait for a release that both NVIDIA and Microsoft have looked over (although Microsoft's WHQL certification has been pretty lax since Windows 10).

Oddly enough, they only seem to list 64-bit links for Windows 8.1 and Windows 10. I'm not sure whether this issue doesn't affect Windows 7 and 32-bit versions of 8.1 and 10, or if they just didn't want to push the hotfix out to them for some reason.

Source: NVIDIA

ASUS Responds to GTX 1080 "Reviewer VBIOS" Concerns

Subject: Cases and Cooling | June 17, 2016 - 04:52 PM |
Tagged: asus, GTX 1080, strix, vbios

Yesterday, there were several news stories posted on TechPowerUp and other sites claiming that ASUS and MSI were sending out review samples of GTX 1080 and GTX 1070 graphics cards with higher clock speeds than retail parts. The insinuation, of course, is that ASUS was cheating, overclocking the cards going to media for reviews in order to artificially inflate the performance results.

130f.jpg

Image source: Techpowerup

From the TechPowerUp story:

"MSI and ASUS have been sending us review samples for their graphics cards with higher clock speeds out of the box, than what consumers get out of the box. The cards TechPowerUp has been receiving run at a higher software-defined clock speed profile than what consumers get out of the box. Consumers have access to the higher clock speed profile, too, but only if they install a custom app by the companies, and enable that profile. This, we feel, is not 100% representative of retail cards, and is questionable tactics by the two companies. This BIOS tweaking could also open the door to more elaborate changes like a quieter fan profile or different power management."

There was, and should be, legitimate concern about these types of moves. Vendor one-upmanship could lead to an arms race of stupidity, similar to what we saw with motherboards and base frequencies years ago, where CPUs would run at a 101.5 MHz base clock rather than 100 MHz (resulting in a 40-50 MHz total clock speed change), giving that board a slight performance advantage. However, the differences we are talking about with the GTX 1080 scandal are very small.

  • Retail VBIOS base clock: 1683 MHz
  • Media VBIOS base clock: 1709 MHz
  • Delta: 1.5%

And in reality, that 1.5% clock speed difference (along with the 1% memory clock rate difference) MIGHT result in around a 1% real-world performance change. Those higher clock speeds are easily accessible to consumers by enabling the "OC Mode" in the ASUS GPU Tweak II software shipped with the graphics card, and the review sample cards can also be adjusted down to the shipping clock speeds through the same channel.
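As a quick sanity check on those percentages (our own math, using the clocks listed above):

```python
# Quick check of the "reviewer VBIOS" clock delta discussed above.
retail_mhz, media_mhz = 1683, 1709

delta = (media_mhz - retail_mhz) / retail_mhz
print(f"Base clock delta: {delta:.1%}")  # ~1.5%

# GPU performance rarely scales 1:1 with core clock, so a 1.5% core bump
# (plus ~1% on memory) netting roughly 1% in games is a reasonable estimate.
```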

strix-1080-box.jpg

ASUS sent along its official statement on the issue.

ASUS ROG Strix GeForce GTX 1080 and GTX 1070 graphics cards come with exclusive GPU Tweak II software, which provides silent, gaming, and OC modes allowing users to select a performance profile that suits their requirements. Users can apply these modes easily from within GPU Tweak II.
 
The press samples for the ASUS ROG Strix GeForce GTX 1080 OC and ASUS ROG Strix GeForce GTX 1070 OC cards are set to “OC Mode” by default. To save media time and effort, OC mode is enabled by default as we are well aware our graphics cards will be reviewed primarily on maximum performance. And when in OC mode, we can showcase both the maximum performance and the effectiveness of our cooling solution.
 
Retail products are in “Gaming Mode” by default, which allows gamers to experience the optimal balance between performance and silent operation. We encourage end-users to try GPU Tweak II and adjust between the available modes, to find the best mode according to personal needs or preferences.
 
For both the press samples and retail cards, all these modes can be selected through the GPU Tweak II software. There are no differences between the samples we sent out to media and the retail channels in terms of hardware and performance.
 
Sincerely,
ASUSTeK COMPUTER INC.

While I don't believe that ASUS' intentions were entirely to save me time in my review, and I think that the majority of gamers paying $600+ for a graphics card would be willing to enable the OC mode through software, it's clearly a bad move on ASUS' part to have done this. Having a process in place at all to create a deviation from retail cards on press hardware is questionable, other than checking for functionality to avoid shipping DOA hardware to someone on a deadline. 

As of today I have been sent updated VBIOSes for the GTX 1080 and GTX 1070 that put them into the exact same mode as the retail cards consumers can purchase.

We are still waiting for a direct response from MSI on the issue as well.

Hopefully this debacle will keep other vendors from attempting to do anything like this in the future. We don't need any kind of "quake/quack" in our lives today.

SK Hynix jumps into Enterprise SSDs with the SE3010

Subject: Storage | June 16, 2016 - 06:54 PM |
Tagged: SK Hynix, enterprise ssd, SE3010

SK Hynix's SE3010 uses their own controller, the eight-channel SH87910AA Pearl, and, in the case of the 960GB model, eight 16nm 128Gb MLC NAND chips with a mysterious H27Q18YEB9a label, plus four capacitors to protect in-flight data against unexpected power loss.  The drive is optimized for read speeds, and Kitguru's testing certainly shows that the implementation was effective.  Check out the write speeds and overall conclusions in the full review.

P6071858.jpg

"When we last looked at an SSD from SK hynix it was from their consumer portfolio. This time around we are looking at a drive from the other part of their storage business in the shape of the SE3010, a read intensive drive for the Enterprise market space."

Here are some more Storage reviews from around the web:

Storage

Source: Kitguru

Must have been a good Computex

Subject: General Tech | June 16, 2016 - 04:30 PM |
Tagged: computex 2016, gx700, avalon, asus

The Tech Report must have been a little worn down by Computex, which is not uncommon as that week-long show will take out even the hardiest of individuals.  Nevertheless they have managed to compose both themselves and a roundup article of everything they officially witnessed during the show.  They picked up some unique photos, such as the innards of the ASUS Avalon modular desktop PC which you can see below.  They also snapped a photo of the twin 330W power supplies required for the water-cooled ASUS GX700 gaming laptop.  There are seven pages in total, so grab a beverage and peruse at your leisure.

DSC_9289.jpg

"A couple weeks ago, we trekked all over Taipei to take in everything that Computex 2016 had to offer. Come with us and see the state of the PC in 2016, as interpreted by dozens of companies both small and large."

Here is some more Tech News from around the web:

Tech Talk

Podcast #404 - Crucial MX300, E3 hardware news, GTX 1080 Shortages and more!

Subject: General Tech | June 16, 2016 - 03:43 PM |
Tagged: XPoint, xbox one, void, video, Strider, Silverstone, rx 480, rx 470, rx 460, podcast, PHAB2, Optane, MX300, Lenovo, GTX 1080, Egil, crucial, corsair, asus, arm

PC Perspective Podcast #404 - 06/16/2016

Join us this week as we discuss the new Crucial MX300 SSD, news on upcoming Xbox hardware changes, GTX 1080 shortages and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

This episode of the PC Perspective Podcast is sponsored by Lenovo!

Hosts:  Ryan Shrout, Allyn Malventano, Jeremy Hellstrom, and Josh Walrath

Program length: 1:48:30
  1. Week in Review:
  2. News items of interest:
    1. 0:39:00 Xbox E3 Hardware Discussion
    2. 0:49:50 GeForce GTX 1080 Shortages?
  3. Hardware/Software Picks of the Week
    1. Ryan: Trackr
    2. Allyn: Safely Remove USB devices (or figure out what’s stopping them)
  4. Closing/outro

Rumor: AMD Plans 32-Core Opteron with 128 PCIe Lanes

Subject: Processors | June 16, 2016 - 03:18 AM |
Tagged: Zen, opteron, amd

We're beginning to see how the Zen architecture will affect AMD's entire product stack. This news refers to their Opteron line of CPUs, which are intended for servers and certain workstations. They tend to allow lots of memory, have lots of cores, and connect to a lot of I/O options and add-in boards at the same time.

amd-2016-e3-zenlogo.png

In this case, Zen-based Opterons will be available in two, four, sixteen, and thirty-two core options, with two threads per core (yielding four, eight, thirty-two, and sixty-four threads, respectively). TDPs will range between 35W and 180W. Intel's Xeon E7 v4 goes up to 165W for 24 cores (on Broadwell-EX), so AMD has a little more headroom to play with for those extra eight cores. That is obviously a lot, and it should be, again, good for cloud applications that can be parallelized.

As for the I/O side of things, the rumored chip will have 128 PCIe 3.0 lanes. It's unclear whether that is per socket or total; the wording sounds like it is per CPU, although much earlier rumors said it has 64 PCIe lanes per socket with dual-socket boards available. It will also support sixteen 10-Gigabit Ethernet connections, which, again, is great for servers, especially with virtualization.

These are expected to launch in 2017. Fudzilla claims that “very late 2016” is possible, but also that they will launch after the high-end desktop parts, which are expected to be delayed until 2017.

Source: Fudzilla

Skyrim Special Edition Announced (now with 64 bits!)

Subject: General Tech | June 16, 2016 - 02:01 AM |
Tagged: skyrim, bethesda

On Sunday, Bethesda held their E3 2016 press conference, where they announced a bunch of content that is relevant to PC gamers. One item was Skyrim: Special Edition. It hasn't been added to their website yet, but it updates The Elder Scrolls V with new assets, shaders, and effects. On the PC, it will be free to anyone who has purchased the base game and all of its expansions.

Even better: it is also compiled as a 64-bit application.

bethesda-2016-skyrim-special-edition.png

One of the original Skyrim's limits, specifically for modders, was that it could only address a little over 3GB of system memory before crashing (a consequence of it being a 32-bit executable). Worse: RAM usage was interconnected with GPU memory usage, which further limited the number of assets you could actually load. While there are probably still plenty of ways for Skyrim to crash, especially when third-party content is injected, Skyrim: Special Edition will remove that hard 3GB wall.

DigitalFoundry also claims that the engine has been updated to a newer branch, like the one used for Fallout 4. This makes sense, because several effects would be difficult to do on DirectX 9 (like volumetric god rays). Despite the newer engine version, Pete Hines of Bethesda said “basically, yes” when asked whether existing Skyrim mods would be compatible. This suggests that the internal API remains the same for at least the majority of cases. Interesting!

Skyrim: Special Edition will be available on October 28th.

Source: PCGamer

Silverstone expands their Argon series with the AR08

Subject: Cases and Cooling | June 15, 2016 - 07:48 PM |
Tagged: Argon Series, Silverstone, AR08

We have seen numerous examples of SilverStone's Argon series of heatsinks, dating back to the AR01 which Morry reviewed in 2014.  The AR08 is a new member of the series: 285g, 92x50x134mm, with a 92mm fan and a $35 price tag.  The small size and price make it a good choice for those on a budget, or for those whose smaller case precludes the use of a Morry-special cooler.  As you might expect, the competition for this cooler is the stock cooler which came with your processor, which in [H]ard|OCP's testing was the one bundled with an i7-4770K.  Check out the full review to see how well it outperforms the stock cooler, in both heat and sound management.

1464635020KbGfWEqSem_2_5_l.jpg

"SilverStone's Argon Series AR08 looks to address those building a budget mid-level computer that balances performance and budget. It does however bring some enthusiast features with it like direct contact heatpipes, a 92mm PWM "diamond edged" fan, and noise dampening technologies. "

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: [H]ard|OCP

New from the creator of X-COM: Phoenix Point

Subject: General Tech | June 15, 2016 - 06:19 PM |
Tagged: gaming, xcom, phoenix point

The turn-based strategy and base management of X-COM will survive in Phoenix Point, but from what was revealed by Rock, Paper, SHOTGUN you may not survive that long.  It is not just the mutated victims of the alien virus you need to be wary of but also your fellow surviving humans, as there are several factions with very different and incompatible survival strategies.  The mutated enemies will not be broken down into distinct and repetitive races; instead they will evolve as you try to defeat them.  Sniper-heavy tactics could result in aliens with reinforced front-facing armour the next time you deal with them; use grenades and the next wave you face may be resistant to fire.  The game sounds very complex, but it is not due for release until 2018 so there is plenty of time for them to make it all work.  Check out more by following the link.

phoenixpoint01.jpg

"One of the most exciting games in Los Angeles this week won’t be featured at press conferences or on the showfloor. Phoenix Point [official site] is the new tactical-strategy hybrid from Julian Gollop, the creator of the original X-COM, and we met yesterday to discuss its procedurally generated alien threats, simulated human factions and much more."

Here is some more Tech News from around the web:

Gaming

A sneak peek at two RX 470 benchmarks

Subject: General Tech | June 15, 2016 - 04:37 PM |
Tagged: rx 470, amd, leak, RX 480M

Sharp eyes over at The Guru of 3D spotted some information in a recent press release from AMD that might have been unintentionally released: performance numbers and mention of an AMD Radeon RX 480M.  These benchmarks are internal and so should be taken with a grain of salt, but they do offer a glimpse at how the RX 470 will perform. The benchmarks were run on a system with an i7-5960X, 16GB of memory, and the Radeon 16.20 driver, showing better performance than an R9 270X in three games as well as in the Firestrike result below.  Follow the link for the results they gleaned from the footnotes.

untitled-2.png

"In the slide-deck that was released yesterday some benchmark numbers have been, well almost hidden. But they are there. I added them into two charts to check out.

Let me clearly state that the benchmarks have been performed by AMD so we cannot verify quality settings. The scores have been derived from the footnotes of the PDF"

Here is some more Tech News from around the web:

Tech Talk

Source: Guru of 3D

How far can a GTX 1070 Founders Edition go?

Subject: Graphics Cards | June 14, 2016 - 05:46 PM |
Tagged: GTX1070, nvidia, overclocking

Overclocking the new Pascal GPUs can be accomplished with the EVGA Precision X tool as it allows you to bump up the power, temperature target and fan speed as well as the frequencies for the GPU and memory easily and effectively.  [H]ard|OCP set out to push the 1070 as far as it would go with this software in a recent review.  The power target can only be increased to 112%, which they implemented along with setting the fan to 100% as this is about the maximum performance, not about peace and quiet.  After quite a bit of testing they settled on 2062MHz GPU and 4252MHz RAM clocks as the highest stable frequency this particular card could manage.  The results show a card which leaves the TITAN X in the dirt and this card does not even have a custom cooler; we anxiously await the non-Founders Edition releases to see what they can accomplish.

1465810411UrsXB3Z0D6_1_1.gif

"In our overclocking review of the NVIDIA GeForce GTX 1070 Founders Edition we will see how far we can overclock the GPU and memory and then compare performance with GeForce GTX TITAN X and GeForce GTX 980 Ti. How high will she go? Can the $449 GTX 1070 outperform a $1000 GTX TITAN X? The answer is exciting."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Google's take on the quantum computer

Subject: General Tech | June 14, 2016 - 05:14 PM |
Tagged: google, quantum computing

IBM, D-Wave and Google are the major players in quantum computing research, with each taking a different route towards developing a Universal Turing Machine built from qubits: a machine that can perform all the computations of a traditional processor but, for certain problems, exponentially faster.  Before the research discussed in this article at Nanotechweb, Google had focused on the adiabatic approach, which is essentially a quantum computer purpose-built to solve a particular problem, not a machine capable of handling any computational problem presented to it.  They have switched tactics and digitized their adiabatic quantum computer to allow for error correction and for non-stoquastic interactions.  This should, in theory, allow for scalability thanks to the unique direction the research is taking.  The reading is rather heavy, especially if you follow the link to Nature, but very interesting if you are curious about new methods of developing quantum computers.

200px-Quantum_computer.svg_.png

"Bringing together the best of two types of quantum computer for the first time, researchers at Google have created a prototype that combines the architecture of both a universal quantum computer and an analogue quantum computer."

Here is some more Tech News from around the web:

Tech Talk

Source: Nanotechweb

Now that's a NUC of a different colour, the NUC6i5SYK

Subject: Systems | June 13, 2016 - 08:03 PM |
Tagged: nuc, Intel, NUC6i5SYK, Skylake

The new NUC6i5SYK may look like the previous generations, but the innards represent a huge step forward.  At its heart is a Skylake Core i5-6260U, which brings with it support for DDR4 and, more importantly, NVMe SSDs. Connectivity includes Ethernet, 802.11ac dual-band WiFi, mini DisplayPort 1.2, and HDMI 1.4b output with proper CEC support.  The barebones kit will run $380 USD, not bad for this type of design.  Missing Remote put the new NUC through its paces; check out the results here.

NUC6i5SYKtp_0.png

"Updated with an Intel Core i5-6260U with Intel Iris Graphics 540, support for NVMe SSD, and DDR4, the system has the opportunity to fix the shortcomings in the previous generation (cough, CSH). The sleek looks and features will not be as much of a bargain as the plug-in-and-go Intel Pentium based NUC5PGYH. Intel is asking $380/£335 for the barebones kit, but with quite a bit more performance, better networking, and features on tap, it could well be worth the extra dosh."

Here are some more Systems articles from around the web:

Systems

AMD "Sneak Peek" at RX Series (RX 480, RX 470, RX 460)

Subject: Graphics Cards, Processors | June 13, 2016 - 07:51 PM |
Tagged: amd, Polaris, Zen, Summit Ridge, rx 480, rx 470, rx 460

AMD has just unveiled their entire RX line of graphics cards at E3 2016's PC Gaming Show. It was a fairly short segment, but it had a few interesting points in it. At the end, they also gave another teaser of Summit Ridge, which uses the Zen architecture.

amd-2016-e3-470460.png

First, Polaris. As we already knew, the RX 480 will bring >5 TFLOPs of compute at a $199 price point. They elaborated that this price applies to the 4GB version, which likely means that another version with more VRAM will be available, and that implies 8GB. Beyond the RX 480, AMD has also announced the RX 470 and RX 460. Little is known about the 470, but they mentioned that the 460 will have a <75W TDP. This is interesting because a PCIe slot provides up to 75W of power, which implies the card will not require any external power connector, and thus could be a cheap and capable (in terms of esports titles) addition to an existing desktop. This is an interesting way to use the power savings of the die shrink to 14nm!

amd-2016-e3-backpackpc.png

They also showed off a backpack VR rig. They didn't really elaborate, but it's here.

amd-2016-e3-summitdoom.png

As for Zen? AMD showed the new architecture running DOOM, and added the circle-with-Zen branding to a 3D model of a CPU. Zen will be coming first to the enthusiast category with (up to?) eight cores, two threads per core (16 threads total).

amd-2016-e3-zenlogo.png

The AMD Radeon RX 480 will launch on June 29th for $199 USD (4GB). None of the other products have a specific release date.

Source: AMD