AMD Releases Radeon Software Crimson Edition 16.5.3

Subject: Graphics Cards | May 25, 2016 - 01:46 AM |
Tagged: vulkan, radeon, overwatch, graphics driver, Crimson Edition 16.5.3, crimson, amd

AMD has released new drivers for Overwatch (and more) with Radeon Software Crimson Edition 16.5.3.

amd-2015-crimson-logo.png

"Radeon Software Crimson Edition is AMD's revolutionary new graphics software that delivers redesigned functionality, supercharged graphics performance, remarkable new features, and innovation that redefines the overall user experience. Every Radeon Software release strives to deliver new features, better performance and stability improvements."

AMD lists these highlights for Radeon Software Crimson Edition 16.5.3:

Support for:

  • Total War: Warhammer
  • Overwatch
  • Dota 2 (with Vulkan API)

New AMD Crossfire profile available for:

  • Total War: Warhammer
  • Overwatch

The driver is available directly from AMD.

The full release notes, with fixed and known issues, are available at the source link.

Source: AMD

NVIDIA Releases 368.22 Drivers for Overwatch

Subject: Graphics Cards | May 24, 2016 - 10:36 PM |
Tagged: nvidia, graphics drivers

Yesterday, NVIDIA released WHQL-certified drivers to align with the release of Overwatch. This version, 368.22, is the first public release of the 367 branch. Pascal is not listed in the documentation as a supported product, so it's unclear whether this will be the launch driver for it. The GTX 1080 comes out on Friday, but two drivers in a week would not be unprecedented for NVIDIA.

While NVIDIA has not communicated this too well, 368.22 will not install on Windows Vista. If you are still using that operating system, you will not be able to upgrade your graphics drivers past 365.19. 367-branch (and later) drivers require Windows 7 and up.

nvidia-geforce.png

Before I continue, I should note that I've experienced some issues getting these drivers to install through GeForce Experience. Long story short, it took two attempts (with a clean install each time) to end up with a successful boot into 368.22. I didn't try the standalone installer that you can download from NVIDIA's website; had the second attempt through GeForce Experience also failed, I would have. That said, after I installed it, it seemed to work well for me with my GTX 670.

While NVIDIA is a bit behind on documentation, the driver also rolls in other fixes. Some GPU compute developers had seen crashes and other failures in certain OpenCL and CUDA applications; those applications now work correctly with 368.22. I've also noticed that my taskbar hasn't been sliding around on its own anymore, but I've only been using the driver for a handful of hours.

You can get GeForce 368.22 drivers from GeForce Experience, but you might want to download the standalone installer (or skip a version or two if everything works fine).

Source: NVIDIA

ASUS and Gigabyte boards see small growth

Subject: General Tech | May 24, 2016 - 05:46 PM |
Tagged: asus, gigabyte

According to the information that DigiTimes was able to garner, only Asustek Computer and Gigabyte Technology will see their motherboard shipments over the first half of the year either hold steady with last year or grow slightly.  As neither is expected to break 20 million units this is not great news, but it is certainly better than the news ASRock, MSI, ECS and Biostar are expecting.  The lack of competition in the CPU/APU market is spilling over to motherboard manufacturers, as customers are not immediately upgrading to the new platforms but are instead choosing piecemeal upgrades.  The next few quarters are going to be interesting as we see what strategies motherboard manufacturers adopt to retain sales.  New boards based on the Intel 200 series chipset will not likely be a factor until the last quarter of this year, at the earliest.

Diamond_road_sign_steep_decline.svg_.png

"The sources expect Asustek and Gigabyte's motherboard shipments in 2016 to stay at the same level in 2015 and neither of them is able to break 20 million units. Since overall demand continues shrinking, the top-2 players are likely to continue taking market share from lower-tier players."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes
Subject: Storage
Manufacturer: Toshiba (OCZ)

Introduction, Specifications and Packaging

Introduction:

The OCZ RevoDrive has been around for a good long while. We looked at the first ever RevoDrive back in 2010. It was a bold move at the time, as PCIe SSDs were both rare and very expensive. OCZ's innovation was to implement a new VCA RAID controller which kept latencies low and scaled properly with increased queue depth. OCZ got a lot of use out of this formula, later moving to the RevoDrive 3 x2, which expanded to four parallel SSDs, and all the way to the enterprise Z-Drive R4, which pushed that out to eight RAIDed SSDs.

110911-140303-5.5.jpg

OCZ's RevoDrive lineup circa 2011.

The latter was a monster of an SSD both in physical size and storage capacity. Its performance was also impressive given that it launched five years ago. After being acquired by Toshiba, OCZ re-spun the old VCA-driven SSD one last time in the form of a RevoDrive 350, but it was the same old formula and high-latency SandForce controllers (updated with in-house Toshiba flash). The RevoDrive line needed to ditch that dated tech and move into the world of NVMe, and today it has!

DSC00772.jpg

Here is the new 'Toshiba OCZ RD400', branded as such under the recent rebadging that took place on OCZ's site. The Trion 150 and Vertex 180 have also been relabeled as TR150 and VT180. This new RD400 has some significant changes over the previous iterations of that line. The big one is that it is now a lean M.2 part, which can come with an optional adapter card for those without an available M.2 slot.

Read on for our full review of the new OCZ RD400!

In Win Releases 303 Mid-Tower Steel and Glass Enclosure

Subject: Cases and Cooling | May 23, 2016 - 03:50 PM |
Tagged: tempered glass, steel, SECC, mid-tower, In Win 303, in win, enclosure, case, atx case

In Win's 303 enclosure has been released, and this mid-tower offers a very clean, simple look, and provides the option of a tempered glass side panel to show off your build.

inwin_303_main.jpg

"The IN WIN team presents the 303, a simple, yet elegant computer chassis crafted from steel and tempered glass. The distinctively clean front panel is complemented with a bright LED design to balance the overall appearance.

The IN WIN logo is highlighted “Neon” as well as the lucent stripped I/O front panel. These gorgeous LEDs also have the purpose of indicating when the PC is activated."

INWIN_303.png

The In Win 303 is available with a white or black finish

The 303 is constructed primarily of SECC (electrogalvanized steel), which should help them keep costs down versus aluminum designs at the expense of some added weight.

303_sides.PNG

The 303 offers buyers a choice between a pair of side panel styles, with both tempered glass and solid aluminum options. As to the former, the company states you are "able to remove the beautiful 3mm tempered glass side panel by just pressing the handle," while the aluminum panel is affixed with thumbscrews.

303_interior.jpg

In Win 303 interior

While the 303 is officially released, with a product page up on In Win's site, actual retail availability in the U.S. might have to wait until after Computex, as listings for the new 303 have yet to appear on Newegg or Amazon (at time of publication).

Source: In Win
Subject: Motherboards
Manufacturer: GIGABYTE

Introduction and Technical Specifications

Introduction

02-board.jpg

Courtesy of GIGABYTE

The X99P-SLI motherboard is the newest member of GIGABYTE's Ultra Durable board line, updated with the newest technological innovations including USB 3.1 and Thunderbolt 3. The board supports all Intel LGA2011-3 based processors paired with DDR4 memory in up to a quad channel configuration. GIGABYTE priced the X99P-SLI at an approachable MSRP of $249.99.

03-board-flyapart.jpg

Courtesy of GIGABYTE

04-mosfet-expl.jpg

Courtesy of GIGABYTE

05-pcie-explained.jpg

Courtesy of GIGABYTE

Like all members of the Ultra Durable board line, the X99P-SLI was over-engineered to take whatever abuse is thrown its way, featuring a 6+4-phase digital power system with International Rectifier Gen 4 digital PWM controllers and Gen 3 PowIRstage controllers, Server Level chokes, and long-life Durable Black Solid capacitors. The board also features GIGABYTE's next generation PCIe x16 slots with PCIe Metal Shielding - steel-reinforced overlays that provide extra vertical support for graphics cards with large, heavy coolers.

Continue reading our review of the GIGABYTE X99P-SLI motherboard!

Was Google holding back at I/O? Not really; but here are a few things you might have missed.

Subject: General Tech | May 20, 2016 - 04:29 PM |
Tagged: google

The Inquirer put together a list of topics that received little to no coverage during the Google I/O keynote, though why Daydream VR was included is hard to say as it was all over the news yesterday.  The Google Play store coming to Chromebooks and Android Pay arriving in the UK were also services we knew about but which did not get a mention.  On the other hand, Google's Tensor Processing Unit really should have received more emphasis as it is rather impressive. If AMD, NVIDIA and Intel were hoping to get a contract from Google to power the next generation of Deep Dream or Google Assistant then they are out of luck.  Take a peek at the other topics The Inq wanted to hear more about here.

1463589742-iofiller.jpg

"As such, here's the INQ top 10 announcements that got bumped from the I/O keynote to a footnote or out of the main speech altogether."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

A trio of high powered gaming laptops

Subject: Mobile | May 19, 2016 - 07:12 PM |
Tagged: msi, origin, eurocom, gaming laptop, GTX 980M

You have likely run into the 17.3" MSI GT72S Dominator Pro Dragon Edition, but have you seen either the Origin EON17-SLX or the Eurocom SKY X9 gaming laptop?  The Eurocom especially is rather impressive: a 4K panel powered by two GTX 980Ms in SLI. The other two have only a single GPU, though it is the desktop version of the GTX 980.

The least expensive of these laptops is $2899, the most expensive is $4837.  Take a look at how these beasts perform over at Hardware Canucks.

HIGH-END-ROUNDUP-28.jpg

"Gaming performance of high end gaming notebooks is quickly closing the gap with desktops. In this roundup we look at over $11,000 worth of desktop replacement options from MSI, Origin and Eurocom."

Here are some more Mobile articles from around the web:

More Mobile Articles

Podcast #400 - Talking GTX 1080 Performance, GTX 1070 specs, AMD Polaris leaks and more!

Subject: General Tech | May 19, 2016 - 04:08 PM |
Tagged: video, radeon, polaris 11, polaris 10, Polaris, podcast, pascal, nvidia, GTX 1080, gtx 1070, gtx, geforce, arm, amd, 10nm

PC Perspective Podcast #400 - 05/19/2016

Join us this week as we discuss the GTX 1080 performance and features, official specifications of the GTX 1070, new Polaris specification rumors, ARM's 10nm chip test and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Lian Li Announces Ebonsteel Line of Affordable Steel Cases

Subject: Cases and Cooling | May 19, 2016 - 02:43 PM |
Tagged: steel, matx case, Lian Li, enclosure, case, atx case

Lian Li is now producing steel enclosures (gasp), which translates into some more affordable options on the market from this venerable brand that typically produces products from aluminum and glass.

lian_li_steel.png

(left to right) Lian Li PC-K5, PC-K6, and PC-K6S

“Lian Li has been known for its signature brushed aluminum aesthetic for decades. For many years, a number of customers have requested more affordable alternatives. The Ebonsteel series is intended to provide this option for those who seek the best value in PC cases. While these chassis are built with steel rather than aluminum, they are still very much Lian Li cases – tool-less building features, grommeted panel cutouts for cable management, vibration-dampened PSU mounts and drive cages, removable mesh filters on fan mounts – they include many of the bells and whistles of a typical Lian Li case!”

The three new enclosures Lian Li is featuring in their news release include the mid-tower PC-K5 and PC-K6, and a silent version of the latter named the PC-K6S. Of these, the PC-K5 is the budget-minded option, with an MSRP of $56 for the standard version and $60 if you'd like a side panel window. The PC-K6 is a more premium offering with an MSRP of $93, and the silent version comes in at $109.

As to availability, I'll quote the press release here: "The Ebonsteel cases will be available in mid June for K5, and late June for K6, K6S."

Source: Lian Li

AMD Gains Market Share in Q1'16 Discrete GPUs

Subject: Graphics Cards | May 18, 2016 - 10:11 PM |
Tagged: amd, radeon, market share

AMD sent out a note yesterday with some interesting news about how the graphics card market fared in Q1 of 2016. First, let's get to the bad news: sales of new discrete graphics solutions, in both mobile and desktop, dropped by 10.2% quarter to quarter, a decrease that was slightly higher than expected. Though details weren't given in the announcement or data I have from Mercury Research, it seems likely that expectations of upcoming new GPUs from both NVIDIA and AMD contributed to the slowdown of sales on some level.

Despite the shrinking pie, AMD grabbed more of it in Q1 2016 than it had in Q4 of 2015, gaining 3.2% of total market share for a total of 29.4%. That's a nice gain in just a few months, but it's still much lower than the share Radeon held as recently as 2013. That 3.2% gain includes both notebook and desktop discrete GPUs, but let's break it down further.

|                  | Q1'16 Desktop | Q1'16 Desktop Change | Q1'16 Mobile | Q1'16 Mobile Change |
|------------------|---------------|----------------------|--------------|---------------------|
| AMD              | 22.7%         | +1.8%                | 38.7%        | +7.3%               |
| NVIDIA (assumed) | ~77%          | -1.8%                | ~61%         | -7.3%               |

AMD's gain in the desktop graphics card market was 1.8%, up to 22.7% of the market, while the notebook discrete graphics share jumped an astounding 7.3% to 38.7% of the total market.

NVIDIA obviously still has a commanding lead in desktop add-in cards with more than 75% of the market, but Mercury Research believes that a renewed focus on driver development, virtual reality, and the creation of the Radeon Technologies Group contributed to the increases in share for AMD.

Q3 of 2016 is where I think the future looks most interesting. Not only will NVIDIA's newly released GeForce GTX 1080 and upcoming GTX 1070 have had time to settle in, but the upcoming Polaris architecture based cards from AMD will have a chance to stretch their legs and attempt to keep pushing AMD's share upward.

SilverStone's AR07, a quiet small-sized cooler for a small-sized system

Subject: General Tech | May 18, 2016 - 08:25 PM |
Tagged: Silverstone, Argon ar07, SFF

SilverStone's Argon series is specialized for systems built in small cases where noise can be an issue.  The AR07 measures 140x50x159mm, perfect for fitting into a smaller system, and from [H]ard|OCP's testing we see the cooler produces 40.9 dB(A) at maximum.  This does mean the cooler is not for systems you plan to heavily overclock, but the performance is good enough to support a minor boost if your HTPC needs a bit more power.  Check out the full review here.

14626602053Ex0dBmAKZ_2_7_l.jpg

"The Argon Series AR07 CPU air cooler is billed by Silverstone as being, "For users looking for a no-nonsense top performing cooler without the premium price, the Argon AR07 is the perfect choice." Three heatpipes, some fins, and a 140mm fan is no-nonsense in our book, so how does it cool?"

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: [H]ard|OCP

Microsoft Releases Windows 7 "Convenience" Roll-Up

Subject: General Tech | May 18, 2016 - 07:40 PM |
Tagged: microsoft, Windows 7, Windows 8.1

I know this sounds like yet another story where Microsoft attempts to ram Windows 10 down your throat, but it's not (apart from a potential interpretation of the last paragraph). It's been about six-and-a-half years since Windows 7 launched, and about five years since Service Pack 1. If you've installed Windows 7 recently, then running Windows Update makes it painfully obvious just how long that's been.

microsoft-2016-windows7-update.png

Image Credit: Microsoft

Finally, Microsoft is making an official roll-up available. Better yet, it can be slipstreamed into install media, so you don't even need to go through that step with each reformat (a rough sketch of that workflow follows below). This will not contain every possible update, though. Microsoft lists 23 patches that they excluded based on three conditions:

  • “They don't have broad applicability.”
  • “They introduce behavior changes.”
  • “They require additional user actions, such as making registry settings.”

They also excluded every update to Internet Explorer, which makes sense. Users can install Internet Explorer 11 and update it, or just uninstall it entirely if they want (after they download whatever browser(s) they will actually use). While some of these excluded fixes will affect many users, it should be a much better experience than several hundred patches and a half-dozen reboots. It's probably better to let the user choose many of these optional updates by hand anyway.
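
If you do want to bake the roll-up into your install media, the general DISM workflow looks roughly like the sketch below. Treat it as an illustration rather than Microsoft's documented procedure: the file names and mount directory are placeholders, and you would substitute the actual roll-up package you downloaded (plus any servicing-stack prerequisite its download page lists).

```python
# Rough sketch of slipstreaming an update package into Windows 7 install media
# with DISM. The paths and file names are placeholders, not real downloads; run
# from an elevated prompt on a Windows machine with the media copied locally.
import subprocess

WIM = r"D:\win7-media\sources\install.wim"   # install image copied from the ISO/USB key
MOUNT = r"C:\wim-mount"                      # an empty working directory
ROLLUP = r"C:\updates\windows7-rollup.msu"   # the downloaded roll-up package

def dism(*args: str) -> None:
    """Run a DISM command and stop if it fails."""
    subprocess.run(["dism", *args], check=True)

# Mount the image (index 1 here; multi-edition media has several indexes),
# inject the package, then commit the changes back into the WIM.
dism("/Mount-Wim", f"/WimFile:{WIM}", "/Index:1", f"/MountDir:{MOUNT}")
dism(f"/Image:{MOUNT}", "/Add-Package", f"/PackagePath:{ROLLUP}")
dism("/Unmount-Wim", f"/MountDir:{MOUNT}", "/Commit")
```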

At the same time, they also announced that “non-security updates” will be merged into a monthly roll-up for both Windows 7 and Windows 8.1 (and several versions of Windows Server). They're not too clear about how this will work, but it sounds like users will not be able to pick and choose parts of optional patches anymore. Given how many of these were attempts to, again, shove Windows 10 down our throats, that's a bit of a concern. However, I suspect that this is just so Microsoft can align its release structure to how it's done on Windows 10. It's probably just easier for them to manage.

Source: Microsoft

DOOM is back

Subject: General Tech | May 18, 2016 - 05:39 PM |
Tagged: gaming, doom

Most reviewers agree that the new DOOM is a callback to the old days of run and gun shooters, not the overly prevalent cover shooters of today.  The speed is the key to having fun: leaping up obstacles, chainsawing demons when you are low on ammo, or simply putting your boot through them.  The shotgun comes early and does exactly what you want it to, or you can choose different favourites from the arsenal you are sure to accumulate.  Rock, Paper, SHOTGUN were hoping for a little more variation in the common demon types and the inevitable mod to enlarge the colour palette used in the game, but are having a great time already.  Check out their first impressions here if you have yet to find the time to play.

doomi2.jpg

"It’s early doors of course, so anything I say below may well become incorrect depending on how things shake out later on. I also haven’t dabbled in multiplayer yet, but will go hang my hide out for an online beating a little later today. "

Here is some more Tech News from around the web:

Gaming

New AMD Polaris 10 and Polaris 11 GPU Details Emerge

Subject: Editorial, Graphics Cards | May 18, 2016 - 05:18 PM |
Tagged: rumor, Polaris, opinion, HDMI 2.0, gpu, gddr5x, GDDR5, GCN, amd, 4k

While NVIDIA's Pascal has held the spotlight in the news recently, it is not the only new GPU architecture debuting this year. AMD will soon be bringing its Polaris-based graphics cards to market for notebooks and mainstream desktop users. While several different code names have been thrown around for these new chips, they are consistently referred to, in general terms, as Polaris 10 and Polaris 11. AMD's Raja Koduri stated in an interview with PC Perspective that the numbers used in the naming scheme hold no special significance, but eventually Polaris will be used across the entire performance lineup (low end to high end graphics).

Naturally, there are going to be many rumors and leaks as the launch gets closer. In fact, Tech Power Up recently came across a number of interesting details about AMD's plans for Polaris-based graphics in 2016, including specifications and which areas of the market each chip is going to be aimed at.

AMD GPU Roadmap.jpg

Citing the usual "industry sources" familiar with the matter (take that for what it's worth, but the specifications do not seem out of the realm of possibility), Tech Power Up revealed that there are two lines of Polaris-based GPUs that will be made available this year. Polaris 10 will allegedly occupy the mid-range (mainstream) graphics option in desktops as well as being the basis for high end gaming notebook graphics chips. On the other hand, Polaris 11 will reportedly be a smaller chip aimed at thin-and-light notebooks and mainstream laptops.

Now, for the juicy bits of the leak: the rumored specifications!

AMD's "Polaris 10" GPU will feature 32 compute units (CUs) which TPU estimates – based on the assumption that each CU still contains 64 shaders on Polaris – works out to 2,048 shaders. The GPU further features a 256-bit memory interface along with a memory controller supporting GDDR5 and GDDR5X (though not at the same time heh). This would leave room for cheaper Polaris 10 derived products with less than 32 CUs and/or cheaper GDDR5 memory. Graphics cards would have as much as 8GB of memory initially clocked at 7 Gbps. Reportedly, the full 32 CU GPU is rated at 5.5 TFLOPS of single precision compute power and runs at a TDP of no more than 150 watts.

Compared to the existing Hawaii-based R9 390X, the upcoming R9 400 series Polaris 10 GPU has fewer shaders and less memory bandwidth. The memory is clocked 1 Gbps higher, but the 256-bit memory bus is half the width of the 390X's 512-bit GDDR5 bus, which results in 224 GB/s of memory bandwidth for Polaris 10 versus 384 GB/s on Hawaii. The R9 390X has a slight edge in compute performance at 5.9 TFLOPS versus Polaris 10's 5.5 TFLOPS; however, the Polaris 10 GPU is using much less power and easily wins at performance per watt. It almost reaches the same level of single precision compute performance at nearly half the power, which is impressive if it holds true!
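
Those bandwidth and efficiency claims are easy to sanity-check with the numbers quoted in the leak; the snippet below is just that arithmetic (the helper functions are mine, not anything from AMD or Tech Power Up).

```python
# Back-of-the-envelope check of the leaked Polaris 10 figures against the R9 390X.
# All inputs come from the table below; the helpers are purely illustrative.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate times bus width, divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

def gflops_per_watt(peak_tflops: float, tdp_watts: float) -> float:
    """Single-precision GFLOPS delivered per watt of rated TDP."""
    return peak_tflops * 1000 / tdp_watts

# Rumored Polaris 10: 7 Gbps GDDR5 on a 256-bit bus, 5.5 TFLOPS at ~150 W
print(memory_bandwidth_gbs(7, 256), gflops_per_watt(5.5, 150))   # 224.0 GB/s, ~36.7 GFLOPS/W

# R9 390X (Grenada/Hawaii): 6 Gbps GDDR5 on a 512-bit bus, 5.9 TFLOPS at 275 W
print(memory_bandwidth_gbs(6, 512), gflops_per_watt(5.9, 275))   # 384.0 GB/s, ~21.5 GFLOPS/W
```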

|                   | R9 390X          | R9 390           | R9 380          | R9 400-Series "Polaris 10" |
|-------------------|------------------|------------------|-----------------|----------------------------|
| GPU Code name     | Grenada (Hawaii) | Grenada (Hawaii) | Antigua (Tonga) | Polaris 10                 |
| GPU Cores         | 2816             | 2560             | 1792            | 2048                       |
| Rated Clock       | 1050 MHz         | 1000 MHz         | 970 MHz         | ~1343 MHz                  |
| Texture Units     | 176              | 160              | 112             | ?                          |
| ROP Units         | 64               | 64               | 32              | ?                          |
| Memory            | 8GB              | 8GB              | 4GB             | 8GB                        |
| Memory Clock      | 6000 MHz         | 6000 MHz         | 5700 MHz        | 7000 MHz                   |
| Memory Interface  | 512-bit          | 512-bit          | 256-bit         | 256-bit                    |
| Memory Bandwidth  | 384 GB/s         | 384 GB/s         | 182.4 GB/s      | 224 GB/s                   |
| TDP               | 275 watts        | 275 watts        | 190 watts       | 150 watts (or less)        |
| Peak Compute      | 5.9 TFLOPS       | 5.1 TFLOPS       | 3.48 TFLOPS     | 5.5 TFLOPS                 |
| MSRP (current)    | ~$400            | ~$310            | ~$199           | $ unknown                  |

Note: Polaris GPU clocks estimated using the assumption of 5.5 TFLOPS being peak compute and an accurate number of shaders. (Thanks Scott.)
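
For readers curious where the ~1343 MHz figure in the table comes from, the estimate works backwards from the peak compute number, assuming GCN's usual two single-precision FLOPs per shader per clock; a minimal sketch of that arithmetic (covering the Polaris 11 figure from the second table as well) is below.

```python
# Estimate a GPU clock from a peak-compute figure, assuming 2 FP32 FLOPs per
# shader per clock (fused multiply-add), as on existing GCN parts. Illustrative only.

def estimated_clock_mhz(peak_tflops: float, shaders: int, flops_per_clock: int = 2) -> float:
    return peak_tflops * 1e12 / (shaders * flops_per_clock) / 1e6

print(round(estimated_clock_mhz(5.5, 2048)))  # ~1343 MHz for "Polaris 10" (32 CUs x 64 shaders)
print(round(estimated_clock_mhz(2.5, 896)))   # ~1395 MHz for "Polaris 11" (14 CUs x 64 shaders)
```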

Another comparison that can be made is to the Radeon R9 380, a Tonga-based GPU with a similar TDP. In this matchup, the Polaris 10 based chip will – at a slightly lower TDP – pack in more shaders, twice the amount of faster-clocked memory with 23% more bandwidth, and provide a 58% increase in single precision compute horsepower. Not too shabby!

Likely, a good portion of these increases are made possible by the move to a smaller process node and the use of FinFET "tri-gate"-like transistors on the Samsung/GlobalFoundries 14LPP FinFET manufacturing process, though AMD has also made some architecture tweaks and hardware additions to the GCN 4.0 based processors. A brief high level introduction is said to be made today in a webinar for AMD's partners (though AMD has come out and said preemptively that no technical nitty-gritty details will be divulged yet). (Update: Tech Altar summarized the partner webinar. Unfortunately there were no major reveals other than that AMD will not be limiting AIB partners from pushing for the highest factory overclocks they can get.)

Moving on from Polaris 10 for a bit, Polaris 11 is rumored to be a smaller GCN 4.0 chip that will top out at 14 CUs (an estimated 896 shaders/stream processors) and 2.5 TFLOPS of single precision compute power. These chips, aimed at mainstream and thin-and-light laptops, will have 50 W TDPs and will be paired with up to 4GB of GDDR5 memory. There is apparently no GDDR5X option for these, which makes sense at this price point and performance level. The 128-bit bus is a bit limiting, but this is a low end mobile chip we are talking about here...

|                   | R7 370                               | R7 400 Series "Polaris 11" |
|-------------------|--------------------------------------|----------------------------|
| GPU Code name     | Trinidad (Pitcairn)                  | Polaris 11                 |
| GPU Cores         | 1024                                 | 896                        |
| Rated Clock       | 925 MHz base (975 MHz boost)         | ~1395 MHz                  |
| Texture Units     | 64                                   | ?                          |
| ROP Units         | 32                                   | ?                          |
| Memory            | 2 or 4GB                             | 4GB                        |
| Memory Clock      | 5600 MHz                             | ? MHz                      |
| Memory Interface  | 256-bit                              | 128-bit                    |
| Memory Bandwidth  | 179.2 GB/s                           | ? GB/s                     |
| TDP               | 110 watts                            | 50 watts                   |
| Peak Compute      | 1.89 TFLOPS                          | 2.5 TFLOPS                 |
| MSRP (current)    | ~$140 (less after rebates and sales) | $?                         |

Note: Polaris GPU clocks estimated using the assumption of 2.5 TFLOPS being peak compute and an accurate number of shaders. (Thanks Scott.)

Fewer details were unveiled concerning Polaris 11, as you can see from the chart above. From what we know so far, it should be a promising successor to the R7 370 series: even with the narrower memory bus and lower shader count, the GPU should be clocked higher (and it might have more shaders in M-series mobile variants than the 370 and lower mobile series) at a much lower TDP, for at least equivalent performance if not a decent increase. The lower power usage in particular will be hugely welcomed in mobile devices, as it should ideally result in longer battery life under the same workloads. I picked the R7 370 as the comparison because it has 4 gigabytes of memory and not that many more shaders, and, being a desktop chip, readers may be more widely familiar with it. Polaris 11 also appears to sit between the R7 360 and R7 370 in terms of shader count and other features, but is allegedly going to be faster than both of them while using (at least on paper) less than half the power.

Of course these are still rumors until AMD makes Polaris officially, well, official with a product launch. The claimed specifications appear reasonable though, and based on that there are a few important takeaways and thoughts I have.

amd-2016-polaris-blocks.jpg

The first thing on my mind is that AMD is taking an interesting direction here. NVIDIA has chosen to start its new generation at the top, announcing "big Pascal" GP100, actually launching the GP104-based GTX 1080 (one of its highest end consumer cards) yesterday, and then introducing lower end products over the course of the year. AMD has opted for the opposite approach: it will start closer to the lower end with a mainstream notebook chip and a high end notebook/mainstream desktop GPU (Polaris 11 and 10, respectively) and then flesh out its product stack over the next year (remember, Raja Koduri stated Polaris and GCN 4 would be used across the entire product stack), building up to bigger and higher end GPUs and finally topping off with its highest end consumer (and professional) GPUs based on "Vega" in 2017.

This means that, for some time after both architectures launch, AMD's and NVIDIA's newest GPUs will not be directly competing with each other. I'm not sure if this was planned by either NVIDIA or AMD or is just how it happened to work out from each following its own GPU philosophy (I'm thinking the latter). Eventually they should meet in the middle (maybe late this year?) with a mid-range desktop graphics card, and it will be interesting to see how they stack up at similar price points and hardware levels. Then, once "Vega" based GPUs hit (sadly, probably in time for NV's big Pascal to launch, heh; I'm not sure if Vega is only a Fury X replacement or goes beyond that to a 1080 Ti or even GP100 competitor), we should see GCN 4 on the new smaller process node square up against NVIDIA and its 16nm Pascal products across the board (the entire lineup). Which will have the better performance, and which will win out in power usage, performance/watt, and performance/$? All questions I wish I knew the answers to, but sadly do not!

Speaking of price and performance/$... Polaris is actually looking pretty good so far at hitting much lower TDPs and power usage targets while delivering at least similar performance, if not a good bit more. Both AMD and NVIDIA appear to be bringing out GPUs better than I expected as far as improvements in performance and power usage (these die shrinks have really helped, even though from here on out that trend isn't really going to continue...). I hope that AMD can at least match NV in these areas at the mid range, even if they do not have a high end GPU coming out soon (not until sometime after these cards launch, and not really until Vega, the high end GCN GPU successor). At least on paper, based on the leaked information, the GPUs so far look good. My only worry is pricing, which I think is going to make or break these cards. AMD will need to price them competitively and aggressively to ensure their adoption and success.

I hope that doing the rollout this way (starting with lower end chips) helps AMD to iron out the new smaller process node, and that they are able to get good yields so that they can be aggressive with pricing here and eventually at the high end!

I am looking forward to more information on AMD's Polaris architecture and the graphics cards based on it!

I will admit that I am not 100% up on all the rumors and I apologize for that. With that said, I would love to hear what your thoughts are on AMD's upcoming GPUs and what you think about these latest rumors!

NVIDIA Releases Full Specifications for GTX 1070

Subject: Graphics Cards | May 18, 2016 - 04:49 PM |
Tagged: nvidia, pascal, gtx 1070, 1070, gtx, GTX 1080, 16nm FF+, TSMC, Founder's Edition

Several weeks ago when NVIDIA announced the new GTX 1000 series of products, we were given a quick glimpse of the GTX 1070.  This upper-midrange card is to carry a $379 price tag in retail form while the "Founder's Edition" will hit the $449 mark.  Today NVIDIA released the full specifications of this card on their website.

Interest in the GTX 1070 is incredibly high because of the potential performance of this card vs. the previous generation.  Price is also a big consideration here, as it is far easier to raise $379 than it is to make the jump to the GTX 1080 and shell out $599 once non-Founder's Edition cards are released.  The GTX 1070 has all of the same features as the GTX 1080, but it takes a hit when it comes to clockspeed and shader units.

gtx_1070_launch.png

The GTX 1070 is a Pascal based part that is fabricated on TSMC's 16nm FF+ node.  It shares the same overall transistor count as the GTX 1080, but it is partially disabled.  The GTX 1070 contains 1920 CUDA cores as compared to the 2560 cores of the 1080; essentially one full GPC is disabled to reach that number.  The clockspeeds take a hit as well compared to the full GTX 1080, but the base clock for the 1070 is still an impressive 1506 MHz and boost reaches 1683 MHz.  This combination of shader count and clockspeed likely makes it a little bit faster than the older GTX 980 Ti.  The rated TDP for the card is 150 watts with a single 8-pin PCI-E power connector, which means there should be some decent headroom when it comes to overclocking this card.  Due to binning and yields, we may not see 2+ GHz overclocks with these cards, especially if NVIDIA cut down the power delivery system as compared to the GTX 1080.  Time will tell on that one.

The memory technology that NVIDIA is using for this card is not the cutting edge GDDR5X or HBM, but rather the tried and true GDDR5.  8 GB of this memory sits on a 256-bit bus, but it is running at a very fast 8 Gbps.  This gives overall bandwidth in the 256 GB/sec region.  When we combine this figure with the memory compression techniques implemented in the Pascal architecture, we can see that the GTX 1070 will not be bandwidth starved.  We have no information on whether this generation of products will mirror what we saw with the previous generation GTX 970 in terms of disabled memory controllers and the 3.5 GB/512 MB memory split due to that unique memory subsystem.
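
As a quick sanity check on those numbers, here is the same kind of arithmetic applied to the GTX 1070 specs quoted above (1920 CUDA cores, 1683 MHz boost, 8 Gbps GDDR5 on a 256-bit bus). The two-FLOPs-per-core-per-clock figure is the standard FMA accounting used for such peak numbers, not something taken from NVIDIA's spec sheet for this particular card.

```python
# Peak FP32 throughput and memory bandwidth implied by the GTX 1070 specs above.
# Assumes 2 single-precision FLOPs per CUDA core per clock (FMA); illustrative only.

cuda_cores = 1920
boost_clock_ghz = 1.683
mem_data_rate_gbps = 8
mem_bus_width_bits = 256

peak_tflops = cuda_cores * boost_clock_ghz * 2 / 1000
bandwidth_gbs = mem_data_rate_gbps * mem_bus_width_bits / 8

print(f"Peak FP32 compute: ~{peak_tflops:.1f} TFLOPS")  # ~6.5 TFLOPS
print(f"Memory bandwidth:  {bandwidth_gbs:.0f} GB/s")   # 256 GB/s
```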

gtx_1070_launch2.png

Beyond those things, the GTX 1070 is identical to the GTX 1080 in terms of DirectX features, display specifications, decoding support, double bandwidth SLI, etc.  There is an obvious amount of excitement for this card considering its potential performance and price point.  The Founder's Edition cards will supposedly be available on June 10 for the $449 MSRP.  I know many people are considering using these cards in SLI to deliver performance for half the price of last year's GTX 980 Ti.  From all indications, these cards will be a significant upgrade for anyone using GTX 970s in SLI.  With greater access to 4K monitors as well as Surround gaming, this could be a solid purchase for anyone looking to step up their game in these scenarios.

Source: NVIDIA

Simply FUD or a message from the Forced Upgrade Department?

Subject: General Tech | May 18, 2016 - 04:44 PM |
Tagged: Intel, microsoft, fud

DigiTimes has a doozy of a post title, stating that Intel plans to limit OS support on future processors starting with Kaby Lake and Apollo Lake CPUs.  This sounds horrible, but you may be taking the word "support" out of context: it refers to the support that major customers require, which leads to the so-called errata (pdf example), not to the processors being incapable of running any OS but Windows 10.  This may not matter much to the average consumer, but for industry and the scientific community it could result in huge costs, as they would no longer be able to get fixes from Intel unless they have upgraded to Windows 10.   That upgrade comes with its own costs: the monstrous amount of time it will take for compatibility testing, application updating and implementation, not to mention licensing fees.

AMD should take note of this and focus on continued legacy support and, most importantly, on advertising that fact.  The price difference between choosing AMD over Intel could become even more compelling for these large customers and help refill AMD's coffers.

Opportunity.jpg

"With Intel planning to have its next-generation processors support only Windows 10, industrial PC (IPC) players are concerned that the move will dramatically increase their costs and affect market demand, according to sources from IPC players."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes
Subject: Processors
Manufacturer: ARM

10nm Sooner Than Expected?

It seems like only yesterday that we had the first major GPU released on 16nm FF+, and now we are talking about ARM being about to receive its first 10nm FF test chips!  Well, in fact it was yesterday that NVIDIA formally released performance figures on the latest GeForce GTX 1080, which is based on TSMC’s 16nm FF+ process technology.  Currently TSMC is going full bore on their latest process node and producing the fastest current graphics chip around.  It has taken the foundry industry as a whole a lot longer than expected to develop FinFET technology, but now that they seemingly have that piece of the puzzle mastered they are moving to a new process node at an accelerated rate.

arm_td01.png

TSMC’s 10nm FF is not well understood by press and analysts yet, but we gather that it is more of a marketing term than a true drop to 10 nm features.  Intel has yet to get past 14nm and does not expect 10 nm production until well into next year.  TSMC is promising their version in the second half of 2016.  We cannot assume that TSMC’s version will match what Intel will be doing in terms of geometries and electrical characteristics, but we do know that it is a step past TSMC’s 16nm FF products.  Lithography will likely get a boost with triple patterning exposure.  My guess is that the back end will also move away from the “20nm metal” stages that we see with 16nm.  All in all, it should be an improved product from what we see with 16nm, but time will tell if it can match the performance and density of competing lines that bear the 10nm name from Intel, Samsung, and GLOBALFOUNDRIES.

ARM has a history of porting their architectures to new process nodes, but they are being a bit more aggressive here than we have seen in the past.  It used to be that ARM would announce a new core or technology, and it would take up to two years to be introduced into the market.  Now we are seeing technology announcements and actual products hitting the scenes about nine months later.  With the mobile market continuing to grow we expect to see products quicker to market still.

arm_td02.png

The company designed a simplified test chip to tape out and send to TSMC for test production on the aforementioned 10nm FF process.  The chip was taped out in December, 2015.  The design was shipped to TSMC for mask production and wafer starts.  ARM is expecting the finished wafers to arrive this month.

Click here to continue reading about ARM's test foray into 10nm!

The 1080 roundup, Pascal in all its glory

Subject: Graphics Cards | May 17, 2016 - 10:22 PM |
Tagged: nvidia, pascal, video, GTX 1080, gtx, GP104, geforce, founders edition

Yes, that's right: if you felt Ryan and Al somehow missed something in our review of the new GTX 1080, or you felt the obvious pro-Matrox bias was showing, here are the other reviews you can pick and choose from.  Start off with [H]ard|OCP, who also tested Ashes of the Singularity and Doom as well as the old favourite Battlefield 4.  Doom really showed itself off as a next generation game, its Nightmare mode scoffing at any GPU with less than 5GB of VRAM available and pushing the single 1080 hard.  Read on to see how the competition stacked up ... or wait for the 1440 to come out some time in the future.

1463427458xepmrLV68z_1_8_l.jpg

"NVIDIA's next generation video card is here, the GeForce GTX 1080 Founders Edition video card based on the new Pascal architecture will be explored. We will compare it against the GeForce GTX 980 Ti and Radeon R9 Fury X in many games to find out what it is capable of."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Crucial Announces Ballistix Sport LT DDR4 SODIMMs

Subject: Memory | May 17, 2016 - 04:09 PM |
Tagged: sodimm, ddr4, crucial ballistix sport

Crucial is releasing some new high end memory for gaming laptops and for those mobile devices which work for a living.  The new Ballistix Sport LT DDR4 SODIMMs will start at speeds of 2400 MT/s and will be fully Intel XMP compatible, assuming your system believes in those DDR4 speeds; if not, look for an update from the manufacturer.  The SODIMMs will be available in sizes of up to 16GB per DIMM, so you should be able to install quite a large pool of memory.  They didn't offer up any pictures as this was being written, but instead a YouTube video of how Ballistix memory is made.

Boise, ID, and Glasgow, UK, -- May 17, 2016 – Crucial, a leading global brand of memory and storage upgrades, today announced the availability of Ballistix® Sport LT DDR4 SODIMMs. Ideal for gamers and performance enthusiasts, the new modules accelerate gaming laptops and small form factor systems by packing faster speeds into every memory slot, enabling users to run demanding games and applications with ease.

With speeds starting at 2400 MT/s, Ballistix Sport LT SODIMMs offer better latencies, reduced load times, and improved frame rates with integrated graphics. The new modules also feature a sleek black PCB and digital camo design and support Intel® XMP 2.0 profiles for easy installation.

“We’re constantly seeking ways to empower gamers with affordable, easy-to-use products that help them gain that competitive, performance edge,” explained Jeremy Mortenson, product marketing manager, Crucial. “With new platforms supporting faster DDR4 SODIMMS coming to the market, the newest Ballistix SODIMM module does just that.”

The Ballistix Sport LT DDR4 SODIMM modules will be available for purchase at www.crucial.com and through select global partners. All Crucial memory is backed by a limited lifetime warranty (valid everywhere except Germany, where the warranty is valid for 10 years from the date of purchase).

For more information about Ballistix memory, visit crucial.com/ballistix.

Source: Crucial