
Report: Leaked AMD Ryzen 7 1700X Benchmarks Show Strong Performance

Subject: Processors | February 21, 2017 - 03:54 PM |
Tagged: ryzen, rumor, report, R7, processor, leak, IPC, cpu, Cinebench, benchmark, amd, 1700X

VideoCardz.com, continuing their CPU coverage ahead of the upcoming Ryzen launch, has posted images from XFASTEST depicting the R7 1700X processor, along with some very promising benchmark screenshots.

AMD-Ryzen-7-1700X.jpg

(Ryzen 7 1700X on the right) Image credit XFASTEST via VideoCardz

The Ryzen 7 1700X is reportedly an 8-core/16-thread processor with a base clock speed of 3.40 GHz. While overall performance from the leaked benchmarks looks very impressive, it is the single-threaded score from the pictured Cinebench R15 run that really makes this CPU look like serious IPC competition for Intel.

AMD-Ryzen-7-1700X-Cinebench.jpg

Image credit XFASTEST via VideoCardz

An overall score of 1537 is outstanding, placing the CPU almost even with the i7-6900K at 1547 based on results from AnandTech:

AnandTech_Benchmarks.png

Image credit AnandTech

And the single-threaded score of the reported Ryzen 7 1700X is 154, which places it above the i7-6900K's score of 153. (It is worth noting that Cinebench R15 shows the 3.40 GHz base clock for this CPU, while CPU-Z displays 3.50 GHz - likely a boost clock, which can reportedly surpass 3.80 GHz.)

Other results from the reported leak include 3DMark Fire Strike, where the Ryzen 7 1700X, clocking in at ~3.90 GHz, produced a physics score of 17,916:

AMD-Ryzen-7-1700X-Fire-Strike-Physics.png

Image credit XFASTEST via VideoCardz

We will know soon enough where this and other Ryzen processors stand relative to Intel's current offerings, and whether Intel will respond to the (rumored) price/performance double whammy of Ryzen. The i7-6900K retails for $1099 and currently sells for $1049 on Newegg.com, so the rumored pricing (taken from Wccftech), if correct, gives AMD a big win here. Competition is very, very good!

wccftech_chart.PNG

Chart credit Wccftech.com

Source: VideoCardz

50GB of high resolution Fallout, finally a use for that 8GB of VRAM?

Subject: General Tech | February 15, 2017 - 07:33 PM |
Tagged: gaming, fallout 4

[H]ard|OCP took a look at the effect Fallout 4's gigantic high resolution texture pack has on system performance in their latest article.  For those who want the answer immediately: the most VRAM they saw utilized was a hair over 5GB, in most cases more than double what the default textures use.  This certainly suggests that those with 4GB cards should think twice before installing the texture pack, while a 6GB card shouldn't see a performance impact.  As for the performance deltas, well, we can't spoil their entire review!

h_quality.PNG

"Bethesda has released its official High Resolution Texture Pack DLC for Fallout 4. We will look at performance impact, VRAM capacity usage levels, and compare image quality to see if this High Resolution Texture Pack might be worthy of your bandwidth in actually improving the gameplay experience."

Here is some more Tech News from around the web:

Gaming

Source: [H]ard|OCP
Subject: Networking
Manufacturer: eero

Living the Mesh Life

Mesh networking is the current hot topic when it comes to Wi-Fi. Breaking from the trend of increasingly powerful standalone Wi-Fi routers that has dominated the home networking scene over the past few years, mesh networking solutions aim to provide wider and more even Wi-Fi coverage in your home or office through a system of multiple self-configuring and self-managing hotspots. In theory, this approach not only provides better wireless coverage overall, it also makes the setup and maintenance of a Wi-Fi network easier for novice and experienced users alike.

eero-1.jpg

Multiple companies have recently launched Wi-Fi mesh systems, including familiar names such as Google, Netgear, and Linksys. But this new approach to networking has also attracted newcomers, including San Francisco-based eero, one of the first companies to launch a consumer-targeted Wi-Fi mesh platform. eero loaned us their primary product, the 3-piece eero Home WiFi System, and we've spent a few weeks testing it as our home router.

eero-2.jpg

This review is the first part of a series of articles looking at Wi-Fi mesh systems, and it will focus on the capabilities and user experience of the eero Home WiFi System. Future articles will compare eero to other mesh platforms and traditional standalone routers, and look at comparative wireless performance and coverage.

Box Contents & Technical Specifications

As mentioned, we're looking at the 3-pack eero Home WiFi System (hereafter referred to simply as "eero"), a bundle that gives you everything you need to get your home or office up and running with a Wi-Fi mesh system. The box includes three eeros, three power adapters, and a 2-foot Ethernet cable.

eero-3.jpg

Each eero device is identical in terms of design and capability, measuring in at 4.75 inches wide, 4.75 inches deep, and 1.34 inches tall. They each feature two Gigabit Ethernet ports, a single USB 2.0 port (currently restricted to diagnostic use only), and are powered by two 2x2 MIMO Wi-Fi radios capable of supporting 802.11 a/b/g/n/ac. In addition, an eero network supports WPA2 Personal encryption, static IPs, manual DNS, IP reservations and port forwarding, and Universal Plug and Play (UPnP).

eero-4.jpg

Continue reading our early testing of the eero Home WiFi system!

Manufacturer: PC Perspective

Living Long and Prospering

The open fork of AMD’s Mantle, the Vulkan API, was released exactly a year ago with, as we reported, a hard launch. This meant public (but not main-branch) drivers for developers, a few public SDKs, a proof-of-concept patch for The Talos Principle, and, of course, the ratified specification. This set the API up to find success right out of the gate, and we can now look back over the year since.

khronos-2017-vulkan-alt-logo.png

Thor's hammer, or a tempest in a teapot?

The elephant in the room is DOOM. This game has successfully integrated the API and it uses many of its more interesting features, like asynchronous compute. Because the API is designed in a sort-of “make a command, drop it on a list” paradigm, the driver is able to select commands based on priority and available resources. AMD’s products got a significant performance boost, relative to OpenGL, catapulting their Fury X GPU up to the enthusiast level that its theoretical performance suggested.

Mobile developers have been picking up the API, too. Google, who is known for banishing OpenCL from their Nexus line and challenging OpenGL ES with their Android Extension Pack (later integrated into OpenGL ES with version 3.2), has strongly backed Vulkan. The API was integrated as a core feature of Android 7.0.

On the engine and middleware side of things, Vulkan is currently “ready for shipping games” as of Unreal Engine 4.14. It is also included in Unity 5.6 Beta, which is expected for full release in March. Frameworks for emulators are also integrating Vulkan, often just to say they did, but sometimes to emulate the quirks of these systems’ offbeat graphics co-processors. Many other engines, from Source 2 to Torque 3D, have also announced or added Vulkan support.

Finally, for the API itself, The Khronos Group announced (pg 22 from SIGGRAPH 2016) areas that they are actively working on. The top feature is “better” multi-GPU support. While Vulkan, like OpenCL, allows developers to enumerate all graphics devices and target them individually with work, it doesn’t have certain mechanisms, like directly ingesting the output of one GPU into another. They haven’t announced a timeline for this.

Should you hang out on the Bridge, or is it worth heading onto the Lake?

Subject: Systems | February 21, 2017 - 06:55 PM |
Tagged: upgrade, sandybridge, kaby lake

The tick-tock of Intel's waltz has stuttered a bit, with many users wondering if it is worth picking up a new Kaby Lake based system.  Gone are the good old days when a new generation of processors guaranteed enough of an increase in performance to justify decreasing your bank account immediately.  There are several reasons for this, including the difficulty of shrinking the process node and increasing the number of transistors, not just the current lack of competition in the marketplace.

At The Tech Report, one of their staff was curious enough to do the upgrade, dumping their i7-2600K for an i7-7700K.  Check out the results of the upgrade, including some impressive effects on the wonky but beloved Arma III engine.

20170204105220_1.jpg

"The question of whether it's worth upgrading from Intel's Sandy Bridge chips accompanies every new TR CPU review. For one TR contributor, the arrival of Kaby Lake finally motivated him to make a move. See what the upgrade to a more modern platform did for him."

Here are some more Systems articles from around the web:

Systems

NVIDIA Releases GeForce 378.72 Hotfix (Bonus: a Discussion)

Subject: Graphics Cards | February 17, 2017 - 12:42 PM |
Tagged: nvidia, graphics drivers

Just a couple of days after publishing 378.66, NVIDIA released GeForce 378.72 Hotfix drivers. This release fixes a bug with video encoding in Steam’s In-Home Streaming, and it also fixes PhysX not being enabled on the GPU under certain conditions. Normally, hotfix drivers solve large-enough issues that were introduced with the previous release. As far as I can tell, though, this time is a little different. These fixes seem to have been intended for 378.66 but, for one reason or another, couldn’t be integrated and tested in time for that driver to be available for the game launches.

nvidia-2015-bandaid.png

This is an interesting side effect of the Game Ready program. There is value in having a graphics driver available on the same day as (or earlier than) a major game’s release, so that people can enjoy the title as soon as it is available. There is also value in having as many fixes as the vendor can provide. These goals oppose each other to some extent.

From a user standpoint, driver updates are cumulative, so users can skip a driver or two if they are not affected by any given issue. AMD has taken up a similar structure, sometimes releasing three or four drivers in a month with only one of them being WHQL certified. For these reasons, I tend to lean on the side of “release ‘em as you got them”. Still, I can see people feeling a little uneasy about a driver being released incomplete to hit a due date.

But, again, that due date has value.

It’s interesting. I’m personally glad that AMD and NVIDIA are on a rapid-release schedule, but I can see where complaints could arise. What’s your opinion?

Source: NVIDIA

Need an AMD processor right this instant? Perhaps some Ashes of the Singularity: Escalation with it?

Subject: General Tech | February 17, 2017 - 12:25 AM |
Tagged: amd, newegg, ashes of the singularity

Do you have a desperate need for a new processor, one that precludes waiting for Ryzen to arrive?  Newegg and AMD have launched a giveaway you might be interested in: a free copy of Ashes of the Singularity: Escalation with the purchase of certain 6- or 8-core AMD FX processors.

ashes.PNG

Models include the AMD FX-8370 with Wraith cooler, FX-8350 BE, FX-8320, FX-8300, FX-6350, and FX-6300.  They may not be the newest chips on the block, but they didn't cost very much and they lasted a long while; plus, they are currently on sale.  The giveaway lasts until May 7, 2017, or when the keys run out, so you can keep an eye out if you are hoping for an even better price.

 

Source: AMD

Vulkan is not extinct; in fact, it might be about to erupt

Subject: General Tech | February 15, 2017 - 06:29 PM |
Tagged: vulkan, Intel, Intel Skylake, kaby lake

The open source API, Vulkan, just received a big birthday present from Intel, which added official support for the integrated GPUs of its Skylake and Kaby Lake CPUs under Windows 10.  We have seen adoption of this API from a number of game engine designers: Unreal Engine and Unity have both embraced it, the latest DOOM release was updated to support Vulkan, and there is even a Nintendo 64 renderer that runs on it.  Ars Technica points out that both AMD and NVIDIA have been supporting this API for a while, and that we can expect to see Android implementations of this close-to-the-metal solution in the near future.

khronos-2016-vulkanlogo2.png

"After months in beta, Intel's latest driver for its integrated GPUs (version 15.45.14.4590) adds support for the low-overhead Vulkan API for recent GPUs running in Windows 10. The driver supports HD and Iris 500- and 600-series GPUs, the ones that ship with 6th- and 7th-generation Skylake and Kaby Lake processors."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

Blizzard Cutting Support for Windows XP and Vista

Subject: General Tech | February 19, 2017 - 10:07 PM |
Tagged: pc gaming, blizzard, windows, EoL

Most companies have already abandoned Windows XP and Vista, including Microsoft once Vista leaves extended support in April, but Blizzard is known for long-term support. This is the company that is still selling Diablo 2 almost seventeen years after it was released, even producing retail discs for it (last I checked) and shipping a patch for it just last year.

blizzard-battlenet-real01.jpg

Later this year, World of Warcraft, StarCraft II, Diablo III, Hearthstone, and Heroes of the Storm will no longer support Windows XP or Vista. This will not all happen at once, even though it would actually make less sense if they did. I mean, why would they coordinate several teams to release a patch at the same time and maximize annoyance to the affected users who cannot schedule or afford an upgrade at that specific time?

Although, if that’s you, then you should probably get around to it sooner than later.

Source: Blizzard

Microsoft Cancels February's Monthly Cumulative Update

Subject: General Tech | February 17, 2017 - 12:01 PM |
Tagged: windows 10, microsoft

Don’t worry if you didn’t receive cumulative Windows Updates this month.

At first, Microsoft showed no love for Valentine’s Day when they delayed the update that was supposed to roll out to the public. No explanation was provided. Two days later, Microsoft decided to write off the whole month. Everything that has been fixed since January 10th will be delayed until March 14th.

windows-10-bandaid.png

This is quite the wait. Peter Bright of Ars Technica notes that “off-cycle updates are also unpopular”. Yes, IT professionals hate it when software vendors are difficult to schedule around. I’m not sure how much that had to do with this decision, though. On the one hand, when a new build launches to the public, it’s not uncommon to have an update (or more) per week over the first couple of months. On the other hand, it would be reasonable for Microsoft to assume that customers, those who carefully test patches before deploying them, would not have ingested a huge, nebulous feature release into their network just weeks after launch. Still, out-of-band updates happen, and it’s interesting that it didn’t happen in this circumstance.

One thing that this patch should have fixed, however, is delayed or clipped display output in games (and other 3D applications) on multi-monitor systems. While not as critical as security, it is probably annoying for anyone affected to need to wait another 28 days. Microsoft claims it will be fixed then, though.

Source: Microsoft

Just Picked Up: Google Daydream View

Subject: Mobile | February 16, 2017 - 12:01 PM |
Tagged: zte, VR, google, daydream

As I mentioned last week, my ZTE Axon 7 has just received Android 7.0 Nougat, which also unlocked Google Daydream VR. I wasn’t able to pick up Google’s VR headset at the time, but now I can, so I did, and I spent a couple of hours messing around with it.

google-2017-daydream-view-closed.jpg

First, I have to say I am very glad that Google made one headset (and controller) that works with all Daydream-compatible phones. That wasn’t entirely clear when I ordered my Axon 7 last summer, and I feared it would lead to a lot of waiting for my OEM to release a specific headset that may or may not be as good as another vendor’s, especially sight unseen. I don’t really know how they properly align the screens across all possible phones while accounting for user error, but it accepts my phone perfectly without any real fiddling. Maybe it’s a simpler problem than I am envisioning, but, either way, the one viewer works with my ZTE Axon 7 -- it’s not just for the Pixel.

My second point is that the phone gets very hot, very quickly. I’m sure ZTE knows about this, and the phone is designed around it, but it doesn’t just get warm; it borders on hot-to-the-touch at times. To be safe, I’m removing the case each time I insert the phone into the Daydream View, although the device seems to work either way. The battery does drain quickly relative to other workloads; a single, hour-or-so VR sitting took about 25% off (~75% remaining). Despite the heat and the quick battery drain, you will probably be done with a VR sitting before the device is, so I consider those aspects all-around positive.

As for the YouTube app, I’m glad that the virtual screen for standard video can be adjusted in pretty much any way. You can make it bigger or smaller with the track pad, pull it in any direction with motion controls, and adjust whether it’s flat or curved (so all points are equidistant) in the settings. If you want to lie on your back in bed and watch movies “on the ceiling”, then you can... and without holding the phone over your face while your arms go numb.

Yes, I’m speaking from experience.

google-2017-daydream-view-open.jpg

As for games? Eh... about the only thing that caught my eye is maybe “Keep Talking and Nobody Explodes”. I’m pleasantly surprised it’s there, but it’s about the only thing that is. I knew there weren’t a whole lot of apps, and that’s fine for me, but you probably shouldn’t get too excited outside of video.

Also, it’d be nice to see Google Chrome in VR. Get on that, Google! A virtual void would make a good place to keep my many, many tabs. It will apparently support WebVR content very soon, but in a “browse, click the Daydream button, mount it in the View, and put it on your head, then undo all that when you’re done” sort of way. It’d be nice to just... stay in the headset to browse.

Anywho, I have it, and those are my thoughts. It’s early, but it seems to work pretty well. I haven’t tried an Oculus Rift or an HTC Vive yet, though, so I can’t make any comparisons there.

Source: Google
Manufacturer: Corsair

Woof

The Corsair Bulldog, version 2.0, is now shipping, upgrading the internal components to a Z270 motherboard and adding fans with improved noise profiles. Other than that, the Bulldog's looks, style, form, and function remain unchanged.
 
 
The design is likely still dividing opinions; the short squat form factor with angular exterior will appeal to some, but the dimensions are perfectly suited to sitting in or around a TV and entertainment center. And because of the spacing and design of the interior, the mini ITX form factor can support any performance level of GPU and Intel's latest Kaby Lake processors. This gives users the flexibility to build for ultimate performance, making the dream of a 4K-ready gaming PC in the living room a possibility.
 
DSC02379.jpg
 
Our quick video review will talk about the design, installation process and our experiences with the system running some premium components.
 
bulldog1.jpg
  • Corsair Bulldog 2.0 (includes case, PSU, MB and CPU cooler)
  • Intel Core i7-7700K
  • 16GB Corsair Vengeance DDR4
  • Corsair Z270 Motherboard mini ITX
  • Corsair Hydro GTX 1080
  • 480GB Neutron GTX SSD
  • 600 watt Corsair PSU

The barebones kit starts at $399 through Corsair.com and includes the case, the motherboard, CPU cooler and 600-watt power supply. Not a bad price for those components!

bulldog2.jpg

You won't find any specific benchmarks in the video above, but you will find some impressions playing Resident Evil 7 in HDR mode at 4K resolution with the specs above, all on an LG OLED display. (Hint: it's awesome.) 

Biostar Launches X370GT7 Flagship Motherboard For Ryzen CPUs

Subject: Motherboards | February 21, 2017 - 10:16 AM |
Tagged: ryzen, M.2, ddr4, biostar, amd, AM4

Biostar is gearing up for AMD's Ryzen release with the launch of several new AM4 motherboards using the X370 and B350 chipsets. At the top of the product stack is the flagship X370GT7 motherboard.

Biostar X370GT7.jpg

The X370GT7 is part of Biostar's racing series and features a black PCB with checkered flag artwork and LED-backlit "armor" over the rear IO edge. The motherboard surrounds the AMD AM4 socket with two large heat spreaders cooling an 8+4 Digital Power+ power phase design (PowIRstage ICs), four DDR4 slots (up to 64GB at 2667 MHz), and an M.2 (32 Gbps) slot with a bundled SSD heat spreader that matches the racing and carbon fiber aesthetic.

The bottom half of the AM4 motherboard houses the X370 chipset, six SATA 3 ports, two PCI-E 3.0 x16 slots (running one at x16 or both at x8 with Ryzen; Bristol Ridge is limited to a single slot at x8), one PCI-E 2.0 x16 (electrically x4) slot, and three PCI-E 2.0 x1 slots. Biostar also highlights the inclusion of 5050 LED headers and a USB 3.1 front panel header with "Lightning Charger", which supports Quick Charge 2.0 (12V@1.5A) as well as Apple devices (5V@2.4A).

Around back, the X370GT7 has the following rear IO ports:

  • PS/2
  • 1 x USB 3.1 Type-C
  • 1 x USB 3.1 Gen 2
  • 4 x USB 3.1 Gen 1 (USB 3.0)
  • 3 x Video Outputs:
    • 1 x DisplayPort (4K@60Hz)
    • 1 x HDMI 2.0 (4K@60Hz)
    • 1 x DVI-D (1200p@60Hz)
  • 1 x Gigabit Ethernet (Realtek RTL8118AS)
  • Audio (Realtek ALC1220, 8 channel Blu Ray Audio, "Biostar Hi-FI")
    • 5 x Analog out
    • 1 x S/PDIF

While an Intel NIC would have been nice to see, the Biostar board looks to offer a decent package of connections, and the Realtek audio codec has been around for a while and should be fairly well developed at this point, though we will have to see how well Biostar's Hi-Fi implementation fares. Further, Biostar offers a small touch panel on the board called GT Touch that lets users switch UEFI profiles between performance and eco-friendly modes, along with power and reset buttons for testing outside of a case. For LED fans, Biostar bundles software called "LED DJ" that lets you configure an LED light show that responds to music being played on the PC. (Yes, this is a thing now, hehe.)

It is nice to see Biostar rising to the occasion and offering up more options for Ryzen CPUs. Unfortunately, as is the case with most things Ryzen, there is no word on pricing or availability yet, though rumors suggest an early March release to coincide with Ryzen processors hitting store shelves.

Source: Biostar
Subject: General Tech
Manufacturer: Logitech

Is Mechanical Mandatory?

The Logitech G213 Prodigy gaming keyboard offers the company's unique Mech-Dome keys and customizable RGB lighting effects, and it faces some stiff competition in a market overflowing with gaming keyboards for every budget (including mechanical options). But it really comes down to performance, feel, and usability, and I was interested in giving these new Mech-Dome keys a try.

DSC_0669.jpg

“The G213 Prodigy gaming keyboard features Logitech Mech-Dome keys that are specially tuned to deliver a superior tactile response and performance profile similar to a mechanical keyboard. Mech-Dome keys are full height, deliver a full 4mm travel distance, 50g actuation force, and a quiet sound operation.

The G213 Prodigy gaming keyboard was designed for gaming, featuring ultra-quick, responsive feedback that is up to 4x faster than the 8ms report rate of standard keyboards and an anti-ghosting matrix that keeps you in control when you press multiple gaming keys simultaneously.”

I will say that at $69.99 the G213 plays in a somewhat odd space relative to the current gaming keyboard market, though it would be well positioned in a retail setting; at a local Best Buy, it would be a compelling option next to $100+ mechanical boards. But savvy internet shoppers see the growing number of <$70 mechanical keyboards available and might question the need for a ‘quasi-mechanical’ option like this. I don’t review products from a marketing perspective, however, and I simply set out to determine whether the G213 is a well-executed product on the hardware front.

DSC_0662.jpg

Continue reading our review of the Logitech G213 gaming keyboard!

Podcast #437 - EVGA iCX, Zen Architecture, Optane, and more!

Subject: Editorial | February 16, 2017 - 06:36 PM |
Tagged: Zen, Z170, webkit, webgpu, podcast, Optane, nvidia, Intel, icx, evga, ECS, crucial, Blender, anidees, amd

PC Perspective Podcast #437 - 02/16/17

Join us for EVGA iCX, Zen Architecture, Intel Optane, new NVIDIA and AMD driver releases, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Allyn Malventano, Ken Addison, Josh Walrath, Jeremy Hellstrom

Program length: 1:32:21


Good news, everybody! Those 30-second YouTube ads are going the way of the dodo

Subject: General Tech | February 17, 2017 - 08:35 PM |
Tagged: youtube

Perhaps someone at YouTube noticed that most people flip to another tab or browser window during those unskippable ads that frequently play at the beginning of videos.  Whatever the cause of the sudden outbreak of common sense, as of 2018 there will no longer be unskippable 30-second ads.  This does not mean you will be free of ads; there will instead be unskippable ads of 15-20 seconds for you to ignore, and you will still see ads in the middle of long videos.  They do have to sell more Red subscriptions, after all.  Slashdot has linked to the original statement if you seek confirmation.

photo.jpg

"We're committed to providing a better ads experience for users online. As part of that, we've decided to stop supporting 30-second unskippable ads as of 2018 and focus instead on formats that work well for both users and advertisers," Google said."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

Intel Details Optane Memory System Requirements

Subject: General Tech, Storage | February 22, 2017 - 12:14 AM |
Tagged: Optane, kaby lake, Intel, 3D XPoint

Intel has announced that its Optane memory will require an Intel Kaby Lake processor to function. While previous demonstrations of the technology used an Intel Skylake processor, it appears this configuration will not be possible on the consumer versions of the technology.

Intel Optane App Accelerator.jpg

Further, the consumer application accelerator drives will also require a 200-series chipset motherboard and either an M.2 2280-S1-B-M or M.2 2242-S1-B-M connector with two or four PCI-E lanes. Motherboards will have to support NVMe v1.1 and Intel RST (Rapid Storage Technology) 15.5 or newer.

It is not clear why Intel is locking Optane technology to Kaby Lake: whether it is due to technical limitations they were not able to resolve to keep Skylake compatible, or simply a matter of not wanting to support the older platform so they can focus on the new Kaby Lake processors. Either way, Kaby Lake is now required if you want UHD Blu-ray playback and Optane 3D XPoint SSDs.

What are your thoughts on this latest bit of Optane news? Has Intel sweetened the pot enough to encourage upgrade holdouts?

Source: Bit-Tech

Intel Quietly Launches Official Optane Memory Site

Subject: Storage | February 16, 2017 - 01:58 AM |
Tagged: XPoint, ssd, Optane, memory, Intel, cache

We've been hearing a lot about Intel's upcoming Optane memory over the past two years, but the information had all been in the form of press announcements and leaked roadmap slides.

optane-memory-marquee-16x9.png.rendition.intel_.web_.1072.603.png

We now have an actual Optane landing page on the Intel site that discusses the first iteration of 'Intel Optane Memory', which appears to be the 8000p Series that we covered last October and saw as an option on some upcoming Lenovo laptops. The site does not cover upcoming enterprise parts like the 375GB P4800X, but instead focuses on the far smaller 16GB and 32GB 'System Accelerator' M.2 modules.

intel-optane-memory-8000p.jpg

Despite using only two lanes of PCIe 3.0, these modules turn in some impressive performance, but the capacities when using only one or two (16GB each) XPoint dies preclude an OS install. Instead, these will be used, presumably in combination with a newer form of Intel's Rapid Storage Technology driver, as a caching layer meant as an HDD accelerator:

While the random write performance and endurance of these parts blow any NAND-based SSD out of the water, the 2-lane bottleneck holds them back compared to high-end NVMe NAND SSDs, so we will likely see this first consumer iteration of Intel Optane Memory in OEM systems equipped with hard disks as their primary storage. A very quick 32GB caching layer should help speed things up considerably for the majority of typical buyers of these types of mobile and desktop systems, while still keeping the total cost below that for a decent capacity NAND SSD as primary storage. Hey, if you can't get every vendor to switch to pure SSD, at least you can speed up that spinning rust a bit, right?

Source: Intel

A good year to sell GPUs

Subject: General Tech | February 21, 2017 - 06:18 PM |
Tagged: jon peddie, marketshare, graphics cards

The GPU market increased 5.6% from Q3 to Q4 of 2016, beating the historical average of -4.7% by quite a large margin; over the year we saw an increase of 21.1%.  That increase is even more impressive when you consider that the total PC market dropped 10.1% over the same period, showing that far more consumers chose to upgrade their existing machines instead of buying new ones.  This makes sense, as neither Intel nor AMD offered a compelling reason to upgrade a processor and motherboard purchased in the last two or three years.

AMD saw a nice amount of growth, grabbing almost 8% of the total market from NVIDIA over the year, though they lost a tiny bit of ground between Q3 and Q4 of 2016.  Jon Peddie's sample includes workstation-class GPUs as well as gaming models, and it seems a fair number of professional users chose to upgrade their machines too, as that market increased just over 19% in 2016.

unnamed.png

"The graphics add-in board market has defied gravity for over a year now, showing gains while the overall PC market slips. The silly notion of integrated graphics "catching up" with discrete will hopefully be put to rest now," said Dr. Jon Peddie, president of Jon Peddie research, the industry's research and consulting firm for graphics and multimedia."

Here is some more Tech News from around the web:

Tech Talk

Subject: Mobile
Manufacturer: Huawei

Introduction and Specifications

The Mate 9 is the current version of Huawei’s signature 6-inch smartphone, building on last year’s iteration with the company’s new Kirin 960 SoC (featuring ARM's next-generation Bifrost GPU architecture), improved industrial design, and an exclusive Leica-branded dual camera system.

Mate9_Main.jpg

In the ultra-competitive smartphone world there is little room at the top, and most companies are simply looking for a share of the market. Apple and Samsung have occupied the top two spots for some time, with HTC, LG, Motorola, and others far behind. But the new #3 emerged not from the usual suspects but from a name many of us in the USA had not heard of until recently: Huawei, maker of the Mate 9. Compared to the preceding Mate 8 (which we looked at this past August), the new handset is a significant improvement in most respects.

With this phone, Huawei has really come into their own with a signature design, and 2016 was a very good year for the company’s smartphone offerings. The P9 handset launched early in 2016, offering not only solid specs and impressive industrial design, but a unique camera that was far more than a gimmick. Huawei’s partnership with Leica has resulted in a dual-camera system that operates differently from those found on phones such as the iPhone 7 Plus, and the results are very impressive. The Mate 9 is an extension of that P9 design, adapted for the larger Mate smartphone series.

DSC_0649.jpg

Continue reading our review of the Huawei Mate 9!