Manufacturer: AMD

Retail cards still suffer from the issue

In our review of AMD's latest flagship graphics card, the Radeon R9 Fury X, I noticed and commented on the unique sound that the card was producing during our testing. A high pitched whine, emanating from the pump of the self-contained water cooler designed by Cooler Master, was obvious from the moment our test system was powered on and remained constant during use. I talked with a couple of other reviewers about the issue before the launch of the card, and it seemed that I wasn't alone. Looking at other reviews of the Fury X, most mention this squeal specifically.

09.jpg

Noise from graphics cards comes in many forms. The most obvious and common is the noise of on-board fans and the air they move. Less frequent, but distinct, is the sound of inductor coil whine. Fan noise spikes when the GPU gets hot, forcing the fans to spin faster and move more air across the heatsink to keep everything running cool. Coil whine changes pitch based on the frame rate (and the frequency of power delivery on the card) and can be alleviated by using higher quality components on the board itself.

But the sound of our Fury X was unique: it was caused by the pump itself and it was constant. The noise it produced did not change as the load on the GPU varied. It was also 'pitchy' - a whine that seemed to pierce through other sounds in the office. A close analog might be the sound of an older CRT TV or monitor left powered on with no input.

During our review process, AMD told us the issue had been fixed. In an email sent to the media just prior to the Fury X launch, an AMD rep stated:

In regards to the “pump whine”, AMD received feedback that during open bench testing some cards emit a mild “whining” noise.  This is normal for most high speed liquid cooling pumps; Usually the end user cannot hear the noise as the pumps are installed in the chassis, and the radiator fan is louder than the pump.  Since the AMD Radeon™ R9 Fury X radiator fan is near silent, this pump noise is more noticeable.  
 
The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump.  This problem has been resolved and a fix added to production parts and is not an issue.

I would disagree that this is "normal", but even so, taking AMD at its word, I wrote that we heard the noise and also that AMD claimed to have addressed it. Other reviewers noted the same comment from AMD, saying the issue was fixed. But very quickly after launch, users began posting videos on YouTube and on forums with the same (or worse) sounds and noise. We had already started bringing in a pair of additional Fury X retail cards from Newegg in order to do some performance testing, so testing those retail cards for pump noise as well seemed like a logical next step.

First, let's get the bad news out of the way: both of the retail AMD Radeon R9 Fury X cards that arrived in our offices exhibit 'worse' noise, in the form of both whining and buzzing, compared to our review sample. In this write up, I'll attempt to showcase the noise profile of the three Fury X cards in our possession, as well as how they compare to the Radeon R9 295X2 (another water cooled card) and the GeForce GTX 980 Ti reference design - added for comparison.

Continue reading our look into the pump noise of the AMD Fury X Graphics Card!

Subject: Systems
Manufacturer: PC Perspective
Tagged: quad-core, gpu, gaming, cpu

Introduction and Test Hardware

logos.jpg

The PC gaming world has become divided by two distinct types of games: those that were designed and programmed specifically for the PC, and console ports. Unfortunately for PC gamers it seems that far too many titles are simply ported over (or at least optimized for consoles first) these days, and while PC users can usually enjoy higher detail levels and unlocked frame rates there is now the issue of processor core-count to consider. This may seem artificial, but in recent months quite a few games have been released that require at least a quad-core CPU to even run (without modifying the game).

One possible explanation for this is current console hardware: PS4 and Xbox One systems are based on multi-core AMD APUs (the 8-core AMD "Jaguar"). While a quad-core (or higher) processor might not be technically required to run current games on PCs, the fact that these core counts exist on consoles might help to explain quad-core CPUs as a minimum spec. This trend could simply be the result of current x86 console hardware, as development of console versions of games is often prioritized (and porting has become common for development of PC versions). So it is that popular dual-core processors like the $69 Intel Pentium Anniversary Edition (G3258) are suddenly less viable for a future-proofed gaming build. Hacking these games might make dual-core CPUs work, and might be the only way to get such a game to even load since the CPU is checked at launch, but this is obviously far from ideal.

4790K_box.jpg

Is this much CPU really necessary?

Rather than rail against this quad-core trend and question its necessity, I decided instead to see just how much of a difference the processor alone might make with some game benchmarks. This quickly escalated into more and more system configurations as I accumulated parts, eventually arriving at 36 different configurations at various price points. Yeah, I said 36. (Remember that Budget Gaming Shootout article from last year? It's bigger than that!) Some of the charts that follow are really long (you've been warned), and there’s a lot of information to parse here. I wanted this to be as fair as possible, so there is a theme to the component selection. I started with three processors each (low, mid, and high price) from AMD and Intel, and then three graphics cards (again, low, mid, and high price) from AMD and NVIDIA.

Here’s the component rundown with current pricing*:

Processors tested:

Graphics cards tested:

  • AMD Radeon R7 260X (ASUS 2GB OC) - $137.24
  • AMD Radeon R9 280 (Sapphire Dual-X) - $169.99
  • AMD Radeon R9 290X (MSI Lightning) - $399
  • NVIDIA GeForce GTX 750 Ti (OEM) - $149.99
  • NVIDIA GeForce GTX 770 (OEM) - $235
  • NVIDIA GeForce GTX 980 (ASUS STRIX) - $519

*These prices were current as of 6/29/15, and of course fluctuate.

Continue reading our Quad-Core Gaming Roundup: How Much CPU Do You Really Need?

New AMD Fury X Pumps May Reduce Noise Levels

Subject: Graphics Cards | July 2, 2015 - 02:03 PM |
Tagged: amd, radeon, fury x, pump whine

According to a couple of users from the Anandtech forums and others, there is another wave of AMD Fury X cards making their way out into the world. Opening up the top of the Fury X card to reveal the Cooler Master built water cooler pump shows there are two different configurations in circulation. One has a teal and white Cooler Master sticker, while the second has a shiny CM logo embossed on it.

fury-new-pump.jpg

This is apparently a different pump implementation than we have seen thus far.

You might have read our recent story looking at the review sample as well as two retail purchased Fury X cards where we discovered that the initial pump whine and noise that AMD claimed would be gone, in fact remained to pester gamers. As it turns out, all three of our cards have the teal/white CM logo.

threefuryxcards.jpg

Our three Fury X cards have the same sticker on them.

Based on at least a couple of user reports, this different pump variation does not have the same level of pump whine that we have seen to date. If that's the case, it's great news - AMD has started pushing out Fury X cards to the retail market that don't whine and squeal!

If this sticker/label difference is in fact the indicator for a newer, quieter pump, it does leave us with a few questions. Do current Fury X owners with louder coolers get to exchange them through RMA? Is it possible that these new pump decals are not indicative of a total pump change over and this is just chance? I have asked AMD for details on this new information already, and in fact have been asking for AMD's input on the issue since the day of retail release. So far, no one has wanted to comment on it publicly or offer me any direction as to what is changing and when.

I hope for the gamers' sake that this new pump sticker somehow will be the tell-tale sign that you have a changed cooler implementation. Unfortunately for now, the only way to know if you are buying one of these is to install it in your system and listen or to wait for it to arrive and take the lid off the Fury X. (It's a Hex 1.5 screw by the way.)

Though our budget is more than slightly stretched, I'm keeping an eye out for more Fury X cards to show up for sale to get some more random samples in-house!

Source: Fudzilla
Subject: Storage
Manufacturer: Samsung

Introduction, Specifications and Packaging

Introduction:

Where are all the 2TB SSDs? It's a question we've been hearing since SSDs started to go mainstream seven years ago. While we have seen a few come along on the enterprise side as far back as 2011, those were prohibitively large, expensive, and out of reach of most consumers. Part of the problem initially was one of packaging. Flash dies simply were not of sufficient data capacity (and could not be stacked in sufficient quantities) to reach 2TB in a consumer friendly form factor. We have been getting close lately, with many consumer focused 2.5" SATA products reaching 1TB, but things stagnated there for a bit. Samsung launched their 850 EVO and Pro in capacities up to 1TB with plenty of additional space inside the 2.5" housing, so it stood to reason that packaging was no longer the limit. Why, then, did they keep waiting?

The first answer is one of market demand. When SSDs were pushing $1/GB, the thought of a 2TB SSD was great right up to the point where you did the math and realized it would cost more than a typical enthusiast-grade PC. That was just a tough pill to swallow, and market projections showed it would take more work to produce and market the additional SKU than it would make back in profits.
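The math behind that tough pill is easy to sketch; the $1/GB figure comes from the era described above, while the PC build budget is a hypothetical round number for comparison:

```python
# Cost of a 2TB SSD in the $1/GB era versus a hypothetical
# enthusiast-grade PC build budget.
SSD_CAPACITY_GB = 2000
PRICE_PER_GB = 1.00           # dollars, roughly the going rate at the time

ssd_cost = SSD_CAPACITY_GB * PRICE_PER_GB
typical_enthusiast_pc = 1500  # hypothetical round figure

print(f"2TB SSD at $1/GB: ${ssd_cost:.0f}")
print(f"More than a ${typical_enthusiast_pc} enthusiast PC: {ssd_cost > typical_enthusiast_pc}")
```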

The second answer is one of horsepower. No, this isn't so much a car analogy as it is simple physics. 1TB SSDs had previously been pushing the limits of controller capabilities in flash and RAM addressing, as well as in handling Flash Translation Layer lookups, garbage collection, and other duties. This means that doubling a given model's capacity is not as simple as doubling the amount of flash attached to the controller: that controller must be able to effectively handle twice the load.
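To put rough numbers on that addressing load, here is a sketch using the common rule of thumb of a flat page-level map with one 4-byte entry per 4 KiB flash page. Samsung's actual FTL layout is not public, so these constants are assumptions for illustration only:

```python
# Rough FTL mapping-table size for a given SSD capacity, assuming a
# flat page-level map: one 4-byte entry per 4 KiB flash page.
# Illustrates why doubling capacity doubles the controller's
# addressing load; not Samsung's actual (undisclosed) design.
PAGE_SIZE = 4 * 1024      # bytes per mapped flash page
ENTRY_SIZE = 4            # bytes per mapping entry

def mapping_table_bytes(capacity_bytes):
    return capacity_bytes // PAGE_SIZE * ENTRY_SIZE

one_tb = 1024 ** 4
for capacity in (one_tb, 2 * one_tb):
    mib = mapping_table_bytes(capacity) / 1024 ** 2
    print(f"{capacity // one_tb}TB drive -> ~{mib:.0f} MiB of map RAM")
```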

With all of that said, it looks like we can finally stop asking for those 2TB consumer SSDs, because Samsung has decided to be the first to push into this space:

150705-191310.jpg

Today we will take a look at the freshly launched 2TB versions of the Samsung 850 EVO and 850 Pro. We will put these through the same tests performed on the smaller capacity models. Our hope is to verify that the changes Samsung made to the controller are sufficient to keep performance scaling, or at least on par, with the 1TB and smaller models of the same product lines.

Read on for the full review!

Report: AMD Radeon Fury Specs and Photos Leaked

Subject: Graphics Cards | July 7, 2015 - 11:59 AM |
Tagged: Radeon Fury, radeon, HBM1, amd

As reported by VideoCardz.com the upcoming Radeon Fury card specs have been leaked (and confirmed, according to the report), and the air-cooled card is said to have 8 fewer compute units enabled and a slightly slower core clock.

fury_screen_1.PNG

The report pictures a pair of Sapphire cards, both using the Tri-X triple-fan air cooler. The first is a reference-clocked version at 1000 MHz (50 MHz slower than the Fury X); the second is overclocked to 1040 MHz. And what of the rest of the specs? VideoCardz has created this table:

fury_screen.PNG

The total number of compute units is 56 (8 fewer than the Fury X), which at 64 stream cores per unit results in 3584 for the non-X GPU. TMU count drops to 224, and HBM1 memory speed is unchanged at 1000 MHz effective. VideoCardz is listing the ROP count at an unchanged 64, but this (along with the rest of the report, of course) has not been officially announced.
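The leaked figures are easy to sanity-check against GCN's layout of 64 stream processors (and 4 TMUs) per compute unit:

```python
# Sanity-check the leaked Fury shader count against GCN's layout:
# each compute unit (CU) carries 64 stream processors and 4 TMUs.
STREAM_CORES_PER_CU = 64
TMUS_PER_CU = 4

fury_x_cus = 64
fury_cus = fury_x_cus - 8   # 8 CUs disabled per the leak

print(f"Fury: {fury_cus} CUs -> {fury_cus * STREAM_CORES_PER_CU} stream processors")
print(f"TMUs: {fury_cus * TMUS_PER_CU}")
```

Both results match the numbers in the VideoCardz table (3584 stream processors, 224 TMUs), which is at least internally consistent for a leak.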

fury_01.PNG

The board will apparently be identical to the reference Fury X

AMD had announced a retail price of $549 for this card, and with the modest reduction in specs (and hopefully some overclocking headroom) it could be an attractive option to compete with the GTX 980, though it will probably need to beat the 980's performance, or at least match its $500 price, to be relevant in the current market. With these specs it looks like it will only be slightly behind the Fury X, so pricing shouldn't be much of an issue for AMD just yet.

AMD Projects Decreased Revenue by 8% for Q2 2015

Subject: Graphics Cards, Processors | July 7, 2015 - 08:00 AM |
Tagged: earnings, amd

The projections for AMD's second fiscal quarter had revenue somewhere between flat and down 6%. The current estimate, as of July 6th, falls below that entire range: AMD now expects revenue to be down 8% from the previous quarter, rather than the aforementioned 0 to 6%. This is attributed to weaker APU sales in OEM devices, though the company claims that channel sales are in line with projections.
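To see what those percentages mean in dollars, here is a quick sketch; the prior-quarter revenue is a hypothetical round number, since the actual figure is not quoted here:

```python
# What the revised guidance means in dollars. The prior-quarter
# revenue is a hypothetical round $1.0B; the percentages are the
# ones from AMD's announcement.
q1_revenue = 1.0e9  # hypothetical base

projected_low = q1_revenue * (1 - 0.06)    # "down 6%"
projected_high = q1_revenue * 1.00         # "flat"
actual_estimate = q1_revenue * (1 - 0.08)  # "down 8%"

print(f"Projected range: ${projected_low/1e9:.2f}B to ${projected_high/1e9:.2f}B")
print(f"New estimate:    ${actual_estimate/1e9:.2f}B (below the entire range)")
```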

amd-new2.png

This is disappointing news for fans of AMD, of course. The next two quarters will be more telling though. Q3 will count two of the launch months for Windows 10, which will likely include a bunch of new and interesting devices and aligns well with back to school season. We then get one more chance at a pleasant surprise in the fourth quarter and its holiday season, too. My intuition is that it won't be too much better than however Q3 ends up.

One extra note: AMD has also announced a “one-time charge” of $33 million USD related to a change in product roadmap. Rather than releasing designs at 20nm, they have scrapped those plans and will architect them for “the leading-edge FinFET node”. That may prove a small expense relative to the benefits of the much smaller process technology. Intel is at 14nm and will likely be there for some time, and now AMD doesn't need to wait around at 20nm for that same duration.

Source: AMD

Is your game library getting huge? Maybe a 2TB SSD is the answer

Subject: Storage | July 6, 2015 - 03:28 PM |
Tagged: ssd, Samsung, 850 PRO, 850 EVO, 2TB

Samsung is extending their 850 EVO and Pro lineups to include 2TB versions of the popular SSDs, thanks to the use of 3D V-NAND; three-bit memory on the EVO and two-bit on the Pro. They are rated at the same speeds as their 500GB-and-above counterparts, and The SSD Review had a chance to test that claim. Interestingly, they did indeed find performance differences between the 1TB and 2TB models of the same design, which you can check out in the full review. Their results were not quite the same as Al's just-posted review; you should compare the two reviews, as well as the systems used, for theories on why that is. You can expect to pay ~$1000 for the 850 Pro 2TB and ~$800 for the 850 EVO 2TB.

Samsung-Pro-and-EVO-2TB-SSD-Exterior-Cases.png

"If you look back over the past several years, there have always been three constants that needed to be addressed in order for SSDs to become a viable consumer solution to storage; value, reliability and capacity. One of our first SSD reviews was on an MTron 32GB SSD with a whopping price tag of more than $1500…and they sold!"

Here are some more Storage reviews from around the web:

Storage

Introduction and Technical Specifications

Introduction

In our previous article here, we demonstrated how to mod the EVGA GTX 970 SC ACX 2.0 video card to get higher performance and significantly lower running temps. Now we decided to take two of these custom modded EVGA GTX 970 cards to see how well they perform in an SLI configuration. ASUS was kind enough to supply us with one of their newly introduced ROG Enthusiast SLI Bridges for our experiments.

ASUS ROG Enthusiast SLI Bridge

02-rog-3way-adapter.jpg

Courtesy of ASUS

03-rog-adapter-profile.jpg

Courtesy of ASUS

For the purposes of running the two EVGA GTX 970 SC ACX 2.0 video cards in SLI, we chose to use the 3-way variant of ASUS' ROG Enthusiast SLI Bridge so that we could run the tests with full 16x bandwidth across both cards (with the cards in PCIe 3.0 x16 slots 1 and 3 in our test board). This customized SLI adapter features a powered red-colored ROG logo embedded in its brushed aluminum upper surface. The adapter supports 2-way and 3-way SLI in a variety of board configurations.

04-all-adapters.jpg

Courtesy of ASUS

ASUS offers their ROG Enthusiast SLI Bridge in 3 sizes for various variations on 2-way, 3-way, and 4-way SLI configurations. All bridges feature the top brushed-aluminum cap with embedded glowing ROG logo.

Continue reading our article on Modding the EVGA GTX 970 SC Graphics Card!

05-sli-config.jpg

Courtesy of ASUS

The smallest bridge supports 2-way SLI configurations with either a two or three slot separation. The middle-sized bridge supports up to a 3-way SLI configuration with a two slot separation required between each card. The largest bridge supports up to a 4-way SLI configuration, also requiring a two slot separation between each card used.

Technical Specifications (taken from the ASUS website)

Dimensions (L x W x H mm)
  • 2-WAY: 97 x 43 x 21
  • 3-WAY: 108 x 53 x 21
  • 4-WAY: 140 x 53 x 21
Weight
  • 2-WAY: 70 g
  • 3-WAY: 91 g
  • 4-WAY: 123 g
Compatible GPU set-ups
  • 2-WAY: 2-WAY-S & 2-WAY-M
  • 3-WAY: 2-WAY-L & 3-WAY
  • 4-WAY: 4-WAY
Contents
  • 2-WAY: 1 x optional power cable & 2 PCBs included for varying configurations
  • 3-WAY: 1 x optional power cable
  • 4-WAY: 1 x optional power cable

Continue reading our story!

Overclocking the R9 390X

Subject: Graphics Cards | July 6, 2015 - 01:58 PM |
Tagged: amd, r9 390x, overclocking

Now that [H]ard|OCP has had more time to spend with the new R9 390X, they have found the overclocks they are most comfortable running on the card they used to test. They used MSI Afterburner 4.1.1 and first overclocked the card without changing voltages at all, which netted them 1150MHz core and 6.6GHz effective on the RAM. From there they started to raise the Core Voltage, eventually settling on +50, as settings higher than that resulted in lower maximum observed voltages due to the TDP being reached and the card throttling back. With that voltage setting they could get the card to run at 1180MHz, with the memory speed remaining at 6.6GHz, as it is not affected by the core voltage setting. With the fan speed set to 80% they saw a consistent 67C GPU temperature. How much impact did that have on performance, and could it push the card's performance beyond an overclocked GTX 980? Read the full review to find out in detail.
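For a rough sense of the percentage gains, here is a quick calculation. The factory clocks of 1100 MHz core and 6100 MHz (effective) memory are my assumption for the MSI card, not figures from the text above, so check the linked review for the exact starting point:

```python
# Percentage gains from [H]ard|OCP's overclock. Factory clocks of
# 1100 MHz core / 6100 MHz memory are assumed, not quoted above.
factory_core, factory_mem = 1100, 6100  # MHz (assumption)
stock_volt_core = 1150                  # MHz, no voltage change
extra_volt_core = 1180                  # MHz, at +50 core voltage
oc_mem = 6600                           # MHz effective

def gain(new, old):
    return (new - old) / old * 100

print(f"Core (stock voltage): +{gain(stock_volt_core, factory_core):.1f}%")
print(f"Core (+50 offset):    +{gain(extra_volt_core, factory_core):.1f}%")
print(f"Memory:               +{gain(oc_mem, factory_mem):.1f}%")
```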

msi-r9-390x.jpg

"We take the new MSI Radeon R9 390X GAMING 8G video card and overclock it to it fullest and compare it with an overclocked GeForce GTX 980 at 1440p and 4K in today's latest games. Find out how much overclocking the R9 390X improves performance, and which video card is best performing. Can R9 390X overclock better than R9 290X?"

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Small in stature, big in performance; CyberPower's Infinity Xtreme Cube

Subject: Systems | June 30, 2015 - 07:23 PM |
Tagged: Infinity Xtreme Cube, Cyberpower

The impressively named Infinity Xtreme Cube from CyberPower is a rather impressive machine, and not just because of the 400GB Intel 750 M.2 PCIe SSD used for storage. The system is built on a Gigabyte X99M-Gaming 5 with an i7-5820K processor, 16GB of HyperX DDR4-2400 in quad channel, and a GTX 970 for video, not to mention the pair of 1TB HDDs in RAID0 for long-term storage. The components are housed in a Corsair Air 240 case, 470x343x381mm (18.5x13.5x15") in size; it is not the easiest case to install components in, which makes it nice that someone does it for you. You pay for the configuration and three-year warranty, but for those who want a working system to arrive at their door, this review at KitGuru is worth looking at. Hopefully, based on the review, CyberPower will make a slight change to the UEFI settings in the future, changing the PCIe Slot Configuration from AUTO to GEN3.

AS7V0517.jpg

"Today we look at a powerful, yet diminutive new system from UK system builder CyberPower called the Infinity Xtreme Cube. This system is built around the Gigabyte X99M-Gaming 5 motherboard – installed inside the tiny Corsair Air 240 chassis."

Here are some more Systems articles from around the web:

Systems

Source: KitGuru

My Take on July 29th Reservations

Subject: General Tech | July 5, 2015 - 06:00 PM |
Tagged: windows 10, microsoft

A couple of days ago, Paul Thurrott wrote an editorial about Microsoft's Windows 10 reservation system. His point was that, while Microsoft claimed users of Windows 7 or 8.1 could upgrade on July 29th, they might not get it until later. Upgrades will start rolling out on the 29th of July, but the actual queue is expected to take several days. According to Microsoft's blog post, which shows blatant disrespect for the Oxford Comma, “Each day of the roll-out, we will listen, learn and update the experience for all Windows 10 users.”

windowsupdate.png

Paul linked this backtrack to an episode of Seinfeld, one where Jerry reserves a rental car; the reservation was taken, but a car was not held. He noted that the availability date was clearly stated as July 29th, yet not everyone will get the upgrade then. I can see his point, and I agree with it. Microsoft really should provide what they claim on the date that they claim it.

On the other hand, it is possible that Microsoft saw the whole reservation system as reserving your spot in line. That is, it might be that upgrade requests will be processed in reservation order, at least mostly, when devices are available. I imagine a “take a number” system where slots will be assigned for anyone below a threshold that increases as upgrades are fulfilled. Again, this is hypothetical, but I cannot really see any other reason for a reservation system in the first place, apart from pure marketing.
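That hypothetical “take a number” system could be sketched like this; to be clear, this is purely illustrative and not Microsoft's actual mechanism:

```python
# A toy model of the hypothetical "take a number" rollout: upgrades
# are granted in reservation order, up to a threshold that rises as
# capacity frees up. Purely illustrative, not Microsoft's system.
class RolloutQueue:
    def __init__(self):
        self.next_ticket = 0
        self.threshold = 0  # tickets below this may upgrade

    def reserve(self):
        ticket = self.next_ticket
        self.next_ticket += 1
        return ticket

    def release_slots(self, count):
        self.threshold += count

    def can_upgrade(self, ticket):
        return ticket < self.threshold

queue = RolloutQueue()
tickets = [queue.reserve() for _ in range(5)]
queue.release_slots(3)  # day one: the first three reservations go through
print([queue.can_upgrade(t) for t in tickets])  # [True, True, True, False, False]
```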

Either way, some may need to wait until after July 29th to experience Windows 10, and Microsoft botched their announcement.

Source: Thurrott.com

Podcast #356 - Fury X Pump Whine, ASUS MG279Q FreeSync Monitor, GTX 980 Ti STRIX and more!

Subject: General Tech | July 2, 2015 - 02:39 PM |
Tagged: podcast, video, fury x, pump whine, asus, mg279q, freesync, strix 980ti, gtx 980ti, seasonic, snow silent, zotac, zbox

PC Perspective Podcast #356 - 07/02/2015

Join us this week as we discuss the Fury X Pump Whine, ASUS MG279Q FreeSync Monitor, GTX 980 Ti STRIX and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Zotac's GTX 980Ti AMP! Extreme Is A Factory Overclocked Monster

Subject: Graphics Cards | July 4, 2015 - 02:39 PM |
Tagged: zotac, maxwell, gtx 980ti, factory overclocked

Zotac recently unleashed a monstrous new GTX 980Ti AMP! Extreme graphics card featuring a giant triple slot cooler and a very respectable factory overclock.

Specifically, the Zotac ZT-90505-10P card is a custom card with a factory overclocked NVIDIA GTX 980Ti GPU and GDDR5 memory. The card is a triple slot design that uses a dual fin stack IceStorm heatsink with three 90mm temperature controlled EKO fans. The cooler wraps the fans and HSF in a shroud and also uses a backplate on the bottom of the card. The card is powered by two 8-pin PCI-E power connectors and display outputs include three DisplayPort, one HDMI, and one DL-DVI.

Zotac ZT-90505-10P GTX 980Ti Amp Extreme Graphics Card.jpg

Zotac was able to push the Maxwell GPU with its 2,816 CUDA cores to 1,253 MHz base and 1,355 MHz boost. Further, the 6GB GDDR5 memory also has a factory overclock of 7,220 MHz. These clockspeeds are a decent bump over the reference speeds of 1,000 MHz GPU base, 1,076 MHz GPU boost, and 7,012 MHz memory.
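Using the figures above, the size of the factory overclock works out as follows:

```python
# How far Zotac's factory overclock sits above NVIDIA's reference
# clocks, using the numbers quoted above.
reference = {"base": 1000, "boost": 1076, "memory": 7012}    # MHz
amp_extreme = {"base": 1253, "boost": 1355, "memory": 7220}  # MHz

for key in reference:
    pct = (amp_extreme[key] - reference[key]) / reference[key] * 100
    print(f"{key:>6}: {reference[key]} -> {amp_extreme[key]} MHz (+{pct:.1f}%)")
```

Roughly a 25% bump on core clocks and 3% on memory, which is a sizable factory overclock for a flagship GPU.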

We’ll have to wait for reviews to know for sure, but on paper this looks to be a card that should run fast and cool thanks to that triple fan cooler. The ZT-90505-10P will be available shortly with an MSRP of $700 and a 2 year warranty.

Definitely not a bad price compared to other GTX 980Ti cards on the market.

Source: Zotac

Microsoft Releases Several Windows 10 Builds This Week

Subject: General Tech | July 5, 2015 - 04:20 PM |
Tagged: windows 10, microsoft

Early this week, Microsoft released a pair of new builds into the Windows Insider Fast Ring. Back to back, Build 10158 was released on Monday and 10159 followed it on Tuesday. These two updates fixed several hundred bugs, officially branded Project Spartan as Microsoft Edge, introduced the new default wallpaper to the desktop and lock screen, and tweaked a few more interface elements since 10130. After an uneventful Wednesday, Build 10162 arrived on Thursday with ISOs released later that evening, which was great for me because I couldn't get the build through Windows Update. Sad face.

windows-10.png

I was a Slow Ring user for the last few releases, and I honestly intend to continue at that pace going forward. This is my production machine, but switching to Fast was tempting in hopes that the new build would fix the few problems that I had. Namely, StarCraft II had been flickering terribly since 10074 when played in windowed mode. Thankfully, StarCraft II can reliably alt+tab without crashing, but the flickering rules out playing a slow-paced Arcade mod on another monitor while doing something else. Mount & Blade: Warband had similar issues, especially when the monitor and game are set to 120 Hz. It seems to be just DirectX 9 titles, too. Either way, they are still unfixed for me, and some of our viewers may want to know my experience.

microsoft-2015-windows-10-10159-upgrade.png

The first thing that I noticed was a seemingly new upgrade screen between asking to reboot and actually rebooting. This was something that I only remember experiencing with Windows Updates, not whole new Windows builds. Perhaps this was a big one for some reason? It did try to install an anti-malware definition alongside it, so maybe it was just a weird interaction between Windows Update and the Windows 10 in-place build upgrade. Maybe it's something new though.

microsoft-2015-windows-10-10159-lock.jpg

The lock screen is the next obvious change. It contains the new Windows branding that was announced a couple of weeks ago. The slanted window was made out of glass, fog, and projected light. Even though it fits the previous branding, Microsoft made a big deal out of it.

The major change occurs once logged in. Microsoft Edge is no longer referred to as “Project Spartan”, and it is basically a full-fledged web browser now. Its performance is great, and it is nice to see the light at the end of the tunnel when it comes to browser compatibility. I do feel that the interface is kind-of ugly, though. Granted, the soft fonts are probably easier to scale between high and low DPI monitors, but I would prefer something more crisp. Likewise, the big, featureless, rectangular UI elements are likely a compromise for touch displays, but I've always thought they were placeholder during development builds. Then again, I find basically every browser to be bland, so there's that.

microsoft-2015-windows-10-10162-notify.jpg

Other UI elements were altered as well. For instance, while I don't pay too much attention to the notification tray, I am pretty sure that Quiet Hours and the OneNote shortcut are new. The “Note” shortcut simply opens OneNote, while Quiet Hours apparently provides a toggle to disable notifications. Quiet Hours is not a new feature, apparently dating back to Windows 8 and Windows Phone, but it has a new home in the notification area.

We're getting close to the July 29th “release” date and might see several builds before then, too. Builds are mostly merging work into a stable core at this point. According to BuildFeed, fbl_impressive, the branch of Windows 10 that is given to Windows Insiders, is up to build 10164, which was created on July 1st. We're not going to see every build of course, some are destined for partners for instance, but the distance between QA-approved builds is shrinking. Unless something is broken that you hope Microsoft will fix, or you can afford the time to upgrade repeatedly, it might be sensible to stay on the Slow Ring until launch. You could always flip to Fast if something cool comes up, although there is sometimes a lag before Windows Update changes your branch when you do that.

Source: Microsoft

Almost NoScript Exploits Whitelist Vulnerabilities

Subject: General Tech | July 6, 2015 - 07:01 AM |
Tagged: noscript, javascript, firefox

I do not really believe in disabling JavaScript, although the ability to control or halt execution would be nice, but you can use an extension to remove it entirely if you want. I say this because the story at hand concerns vulnerabilities in the NoScript extension, which locks down JavaScript and other non-static content. By “vulnerabilities”, we mean the ability to execute JavaScript, something every major browser enables by default because the vendors consider it safe for their users on its own.

NoScript.png

This is like a five-year-old figuring out how to unlock a fireworks case full of paper crackers.

Regardless, there are two vulnerabilities, both of which have already been patched. Both take advantage of the whitelist functionality to slip malicious code through. By default, NoScript trusts a handful of domains, because blocking every script on the web would break too much of the internet.

The first problem is that the whitelist has accumulated some cruft, including domain names that serve no purpose and even some that have expired and become available for purchase. To prove the point, Matthew Bryant purchased zendcdn.net and used it to serve his own JavaScript. The second problem is similar, but slightly different: rather than an expired domain, some whitelist entries, such as googleapis.com, have sub-domains, like storage.googleapis.com, that accept untrusted user scripts (it is part of Google's Cloud Platform).
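To illustrate why a domain-level whitelist entry also trusts every sub-domain, here is a sketch of a typical suffix-matching check; the function and list are hypothetical, not NoScript's actual implementation:

```python
# Why whitelisting "googleapis.com" also trusts storage.googleapis.com:
# a typical domain whitelist matches any host whose parent domain is
# listed. Hypothetical sketch, not NoScript's real code.
WHITELIST = {"googleapis.com", "zendcdn.net"}

def is_trusted(host):
    parts = host.split(".")
    # check the host itself and every parent domain against the whitelist
    return any(".".join(parts[i:]) in WHITELIST for i in range(len(parts)))

print(is_trusted("storage.googleapis.com"))  # True: user-hosted scripts ride along
print(is_trusted("example.com"))             # False
```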

Again, even though JavaScript is about as secure as you can get in an executable language, you should be allowed to control what executes on your machine. As stated, NoScript has already addressed these issues in a recent update.

Tick Tock Tick Tock Tick Tock Tock

A few websites have been re-reporting on a leak from BenchLife.info about Kaby Lake, which is supposedly a second 14nm redesign (“Tock”) to be injected between Skylake and Cannonlake.

UPDATE (July 2nd, 3:20pm ET): It has been pointed out that many hoaxes have come out of the same source, and that I should be more clear in my disclaimer. This is an unconfirmed, relatively easy to fake leak that does not have a second, independent source. I reported on it because (apart from being interesting enough) some details were visible in the images but not highlighted in the leak, such as "GT0" and a lack of Iris Pro on -K parts. That suggests the leaker got the images from somewhere but didn't notice those details, which implies either that the original source was hoaxed by an anonymous party who seeded the hoax to a single outlet, or that it was an actual leak.

Either way, enjoy my analysis, but realize that this is a single, unconfirmed source that has allegedly published hoaxes in the past.

intel-2015-kaby-lake-leak-01.png

Image Credit: BenchLife.info

If true, this would be a major shift in both Intel's current roadmap as well as how they justify their research strategies. It also includes a rough stack of product categories, from 4.5W up to 91W TDPs, including their planned integrated graphics configurations. This leads to a pair of interesting stories:

How Kaby Lake could affect Intel's processors going forward. Since 2006, Intel has only budgeted a single CPU architecture redesign for any given fabrication process node. Taking two attempts on the 14nm process buys time for 10nm to become viable, but it could also give them more time to build up a better library of circuit elements, allowing them to assemble better processors in the future.

What type of user will be given Iris Pro? Also, will graphics-free options be available below the Enthusiast class? When buying a processor from Intel, the high-end mainstream parts tend to have GT2-class graphics, such as the Intel HD 4600. Enthusiast platforms, such as Haswell-E, cannot be used without discrete graphics -- the die space is spent on more cores, I/O lanes, and other features. As we will discuss later, Broadwell took a step toward bringing Iris Pro to the high-end mainstream, but it doesn't seem like Kaby Lake will make any more progress there. Also, if I am interpreting the table correctly, Kaby Lake might bring iGPU-less CPUs to LGA 1151.

Keeping Your Core Regular

To the first point, Intel has been on a steady tick-tock cycle since the Pentium 4 architecture reached the 65nm process node, which was a "tick". The "tock" came from the Conroe/Merom architecture that was branded "Core 2". This new architecture was a severe departure from the high-clock, relatively low-IPC design that Netburst was built around, and it almost instantly turned the processor landscape from AMD dominance into a runaway Intel lead.

intel-tick-tock.png

After 65nm and Core 2 started the cycle, every new architecture alternated between shrinking the existing architecture to smaller transistors (tick) and creating a new design on the same fabrication process (tock). Even though Intel has been steadily increasing their R&D budget over time, which is now in the range of $10 to $12 billion USD each year, creating smaller, more intricate designs with new process nodes has been getting harder. For comparison, AMD's total revenue (not just profits) for 2014 was $5.51 billion USD.

Read on to see more about what Kaby Lake could mean for Intel and us.

Report: ASUS STRIX AMD Radeon Fury (Non-X) Card Listings Found

Subject: Graphics Cards | July 3, 2015 - 08:45 PM |
Tagged: strix, rumor, report, Radeon Fury, asus, amd

A report from VideoCardz.com shows three listings for an unreleased ASUS STRIX version of the AMD Radeon Fury (non-X).

ASUS-R9-Fury-DC3-GAMING-Computer-PC-Shop.png

Image credit: VideoCardz

The listings are from European sites, and all three list the same model name: ASUS-STRIX R9FURY-DC3-4G-GAMING. You can find the listing from the above photo here at the German site Computer-PC-Shop.

ASUS-STRIX-R9FURY-DC3-4G-GAMING-Radeon-R9-Fury-4GB-HBM.png

Image credit: VideoCardz

We can probably safely assume that this upcoming air-cooled card will make use of the new DirectCU III cooler introduced with the STRIX GTX 980 Ti and STRIX R9 390X. That massive triple-fan cooler should provide an interesting look at what Fury can do without the Fury X's AIO liquid cooler, and air cooling will of course sidestep the pump whine that many have complained about with certain Fury X units.

R9390X_STRIX.png

The ASUS STRIX R9 390X Gaming card with DirectCU III cooler

We await official word on this new GPU, and on what price we might expect this particular version to sell for here in the U.S.A.

Source: VideoCardz

$110 Intel Compute Stick With Ubuntu 14.04 LTS Coming Soon

Subject: General Tech | July 7, 2015 - 04:21 PM |
Tagged: Z3735F, ubuntu 14.04, SFF, linux, Intel, compute stick

Intel is giving Linux some love with a new Compute Stick equipped with Ubuntu Linux 14.04 LTS coming out this week for $110. This new model comes with less RAM and internal storage along with a $40 price cut versus the previous Compute Stick (which comes with Windows 8.1 With Bing). 

On the outside, the new Linux-equipped Compute Stick (STCK1A8LFC) is identical to the existing SKU (read our review here) with its flash drive form factor, Intel logo, and small vents along the top and sides. Ports on the Intel STCK1A8LFC include one HDMI, one Micro USB port for power, one Micro SD card slot for storage, and a single full size USB 2.0 port for peripherals.

Intel Compute Stick STCK1A8LFC With Ubuntu 14.png

The Compute Stick is powered by an Intel Z3735F processor that is actively cooled by a tiny fan. This chip is a 22nm Bay Trail part with four CPU cores and Intel HD Graphics. The CPU has a base clock of 1.33 GHz and a maximum turbo clock speed of 1.83 GHz. The SoC is paired with 1GB of DDR3L memory and 8GB of internal eMMC flash storage. There is also an 802.11b/g/n wireless radio with Bluetooth. The table below compares these specifications to those of the Windows-based Compute Stick.

          Compute Stick (Ubuntu)   Compute Stick (Windows)
CPU       Z3735F                   Z3735F
RAM       1 GB                     2 GB
Storage   8 GB                     32 GB
Price     $110                     $150
Model #   STCK1A8LFC               STCK1A32WFC

The STCK1A8LFC with Ubuntu 14.04 LTS will be available later this week from all the usual online retailers with an MSRP of $110.

It would have been nice to keep the 2GB of RAM even if Intel could not cut the price as much. There is always Micro SD for more storage, but 1GB of RAM is going to be somewhat limiting even for a Linux OS, which can typically be made to run much leaner than Windows. It is nice to see Linux get a design win and come bundled on a portable PC. If you need more RAM in your Compute Stick, however, you will need to buy the more expensive Windows version – at $150 – and install Linux yourself.

Source: Intel

Wi-Fi password sharing, a little-known Windows Phone feature, is about to hit the big time

Subject: General Tech | July 1, 2015 - 03:25 PM |
Tagged: Wi-Fi Sense, _optout, windows 10

Wi-Fi Sense has been a feature on phones running Windows 8.1: enter your password on the phone, and any computer logged in with the same Microsoft account can connect to your wireless network, with the password stored encrypted on a Microsoft server.  It looks as though this feature will be available on all Windows 10 devices, sharing your wireless passwords with all of your Outlook, Skype, and even Facebook contacts if you enable it.  This is certainly handy when visiting a friend's house, as you will not need to ask for the wireless password, but it does raise some security concerns.  If you happen to have Outlook contacts on your work machine who are not necessarily co-workers, they would be able to access your corporate network, as unfortunately would their contacts, and worse still, so could anyone who had compromised any of those accounts or machines.  The password is encrypted and not easy to access directly, and the feature does appear to limit shared connections to WAN access, somehow blocking the LAN even with proper credentials.  As The Register rightly points out, if a password is the totality of your access management protocols, you are already doing it wrong, but this is something all users should be aware of.
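The "_optout" tag above refers to Microsoft's published opt-out mechanism: network owners who do not want their access point shared via Wi-Fi Sense can add the literal string "_optout" to the SSID. A minimal sketch of that check (the function name is ours, not Microsoft's):

```python
def wifi_sense_shareable(ssid: str) -> bool:
    """A network is excluded from Wi-Fi Sense sharing when its
    SSID contains the literal marker '_optout', per Microsoft's
    published opt-out guidance."""
    return "_optout" not in ssid

# A normal SSID is eligible for sharing:
wifi_sense_shareable("CoffeeShop")
# Renaming the network excludes it:
wifi_sense_shareable("CorpNet_optout")
```

Note that the burden falls on the network owner: the default is shareable, and opting out means renaming your SSID and re-joining every device.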

wp_ss_20140602_0009a.png

"A Windows 10 feature, Wi-Fi Sense, smells like a security risk: it shares Wi-Fi passwords with the user's contacts."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

When the going gets tough, the TUF get going to ASUS

Subject: Motherboards | July 2, 2015 - 06:17 PM |
Tagged: Sabertooth X99, asus, tuf

The TUF series of ASUS boards is recognizable thanks to the Thermal Armour covering the vast majority of the board, and the line is marketed as having mil-spec components to outlast other motherboards using the same chipset.  This board supports quad-GPU setups, but keep in mind that there is also an M.2 port, that you need a more expensive CPU, and that only three of the slots are PCIe 3.0 x16; the fourth card will sit in a PCIe 2.0 x4 slot, leaving a single x1 slot for other cards. The AI Suite III overclocking software is not supported on this TUF board, but [H]ard|OCP had great success overclocking manually, some of their reviewers more so than others.  Check out the full review if you are comparison shopping for an X99 motherboard.

1434180518asAm5ImvY1_1_10_l.jpg

"ASUS’ SABERTOOTH X99 promises premium quality and unmatched stability alongside industry leading fan control. Sabertooth motherboards have in the past all been universally excellent and this motherboard is one of the newest in the TUF series. Can ASUS keep that streak going? It's going to be TUF."

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP