
PCPer Live! AMD Radeon Crimson Live Stream and Giveaway!

Subject: Graphics Cards | November 24, 2015 - 09:08 PM |
Tagged: video, radeon software, radeon, live, giveaway, freesync, crimson, contest, amd

UPDATE: Did you miss today's live stream? No worries! You can get the full rundown of the new Radeon Software Crimson Edition driver and get details on new features like FreeSync Low Frame Rate Compensation, DX9 frame pacing, custom resolutions, and more. Check out the video embed below.

It's nearly time for the holidays to begin but that doesn't mean the hardware and software news train comes to a halt! This week we are hosting AMD in the PC Perspective offices for a live stream to discuss the upcoming release of the new AMD Radeon Software Crimson Edition. Earlier in the month we showed you a preview of what changes were coming to the AMD GPU driver and now we are going to not only demo it for you but let the community ask AMD questions directly about it!


And what's a live stream without prizes? AMD has stepped up to the plate to offer up some awesome hardware for those of you that tune in to watch the live stream! 

  • 2 x AMD Radeon R9 Nano 4GB Fiji Graphics Cards
  • 2 x PowerColor PCS+ Radeon R9 380 Graphics Cards




AMD Radeon Software Crimson Live Stream and Giveaway

12pm PT / 3pm ET - November 24th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, November 24th at 12pm PT / 3pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prizes you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

I will be joined by Adrian Costelo, Product Manager for Radeon Software, and Steven Gans, UX Designer for Radeon Software. In short, these are the two people you want to hear from and have answer your questions!

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from AMD?

So join us! Set your calendar for Tuesday at 12pm PT / 3pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

Source: PCPer Live!
Manufacturer: AMD

FreeSync and Frame Pacing Get a Boost

Make sure you catch today's live stream we are hosting with AMD to discuss much more about the new Radeon Software Crimson driver. We are giving away four Radeon graphics cards as well!! Find all the information right here.

Earlier this month AMD announced plans to end the life of the Catalyst Control Center application for control of your Radeon GPU, introducing a new brand simply called Radeon Software. The first iteration of this software, Crimson, is being released today and includes some impressive user experience changes that are really worth seeing and, well, experiencing.

Users will no doubt lament the age of the previous Catalyst Control Center; it was slow, clunky and difficult to navigate around. Radeon Software Crimson changes all of this with a new UI, a new backend that allows it to start up almost instantly, as well as a handful of new features that might be a surprise to some of our readers. Here's a quick rundown of what stands out to me:

  • Opens in less than a second in my testing
  • Completely redesigned and modern user interface
  • Faster display initialization
  • New clean install utility (separate download)
  • Per-game Overdrive (overclocking) settings
  • LiquidVR integration
  • FreeSync improvements at low frame rates
  • FreeSync planned for HDMI (though not implemented yet)
  • Frame pacing support in DX9 titles
  • New custom resolution support
  • Desktop-based Virtual Super Resolution
  • Directional scaling for 2K to 4K upscaling (Fiji GPUs only)
  • Shader cache (precompiled) to reduce compiling-induced frame time variance
  • Non-specific DX12 improvements
  • Flip queue size optimizations (frame buffer length) for specific games
  • Wider target range for Frame Rate Target Control


That's quite a list of new features, some of which will be more popular than others, but it looks like there should be something for everyone to love about the new Crimson software package from AMD.

For this story today I wanted to focus on two of the above features that have long been a sticking point for me, and see how well AMD has fixed them with the first release of Radeon Software.

FreeSync: Low Frame Rate Compensation

I might be slightly biased, but I don't think anyone has done a more thorough job of explaining and diving into the differences between AMD FreeSync and NVIDIA G-Sync than the team at PC Perspective. Since day one of the G-Sync variable refresh release we have been following the changes and capabilities of these competing features and writing about what really separates them from a technological point of view, not just pricing and perceived experiences. 

Continue reading our overview of new features in AMD Radeon Software Crimson!!

Software Development Debate: Design vs Profile

Subject: General Tech | November 22, 2015 - 02:08 PM |
Tagged: software development

This article is a bit different from what we normally post, so be sure to give your opinion in the comments if you want to see more of this (or not). You will most often find these types of software development rants on a programmer's personal blog, but I find them interesting, so here we go.


There are two camps in software development regarding optimization, each with diehard advocates. One side argues for software to be strictly designed, with decisions needing to be coherent and performance-minded. The other viewpoint claims that optimization should be done after profiling, because you could spend weeks making a fairly useless chunk of code purr like a kitten, and ignore the turkey that's using 99.99% of your resources.

Both sides can also point to situations that validate their opinion. The latter, “don't prematurely optimize” crowd can show examples where time is wasted because the engineer didn't look before they leaped. One such story comes from Chandler Carruth of Google. One of his first tasks at the company was to review code from Ken Thompson, a “very senior engineer” who created Unix and defined UTF-8. It solved rule-matching with about a 20-fold performance increase over what they were currently using. When Chandler went to integrate the fix, his colleague mentioned: “Yeah, turns out our use-case overwhelmingly hits a single rule, so I just check it first. It's now 100x faster.”
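Carruth's anecdote boils down to “measure first, then exploit the distribution you measured.” Here is a minimal Python sketch of that idea; the rules and inputs are hypothetical stand-ins, not the actual code from the story:

```python
import re

# Hypothetical rule set -- stand-ins for whatever matcher the anecdote involved.
RULES = [re.compile(p) for p in (r"^GET /static/", r"^POST /api/", r"^DELETE /api/")]

def match_any(line):
    """Straightforward version: scan every rule in order."""
    return any(r.match(line) for r in RULES)

def match_hot_first(line, _hot=RULES[0]):
    """Profiled version: if measurement shows one rule handles nearly
    every input, test that rule first and fall back to the full scan."""
    return bool(_hot.match(line)) or match_any(line)

if __name__ == "__main__":
    # Profiling is what justifies the shortcut in the first place.
    import cProfile
    lines = ["GET /static/app.js"] * 10_000 + ["POST /api/upload"]
    cProfile.run("for l in lines: match_hot_first(l)", sort="tottime")
```

The point is not the regexes themselves but the order of operations: the shortcut is only defensible because the profile showed where the time actually went.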

The other crowd says that, even if you can find exactly where poop stinks, you're still polishing a turd. One issue that is commonly pointed to is garbage collection. In memory-managed languages, this process scans through your application to delete unused chunks. Its goal is to remove memory leaks without users needing to carefully manage allocation themselves. The problem is that it typically freezes every thread (“stop-the-world”) and often takes several frames' worth of time to complete. As such, you can either live with the bad user experience in real-time applications, or you can carefully design your application to avoid generating garbage. If you take the time to design and architect, you can either choose a framework without garbage collection, or sometimes reduce or eliminate how often it triggers.
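One common design-time mitigation is object pooling: allocate everything up front and recycle it, so the collector has little or no new garbage to chase mid-frame. A minimal, generic sketch (the `Pool` class and its API are illustrative, not taken from any particular engine):

```python
class Pool:
    """Minimal free-list pool: allocate up front, reuse in steady state,
    so the steady-state loop creates (almost) no new objects for the
    garbage collector to trace."""

    def __init__(self, factory, size):
        self._factory = factory
        # Pre-allocate the expected working set once, at startup.
        self._free = [factory() for _ in range(size)]

    def acquire(self):
        # Reuse a recycled object if one exists; allocate only on overflow.
        return self._free.pop() if self._free else self._factory()

    def release(self, obj):
        # Caller is responsible for resetting obj's state before release.
        self._free.append(obj)
```

In a real-time loop you would `acquire()` at spawn and `release()` at despawn, keeping allocation (and therefore collection) out of the hot path.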

So the argument is over-thinking wasting time versus under-planning painting software into corners. As it should be somewhat obvious, both are correct. It's a bad idea to blindly charge into development, and it's good to think about the consequences of what you're doing. At the same time, what you think means nothing if it differs from what you measure, so you need to back up your thoughts with experimentation.

The challenge is to chart a middle course that captures the benefits of both, without falling into the traps on either side.

Intel to Ship FPGA-Accelerated Xeons in Early 2016

Subject: Processors | November 20, 2015 - 06:21 PM |
Tagged: Intel, xeon, FPGA

Designing integrated circuits, as I've said a few times, is basically a game. You have a blank canvas that you can etch complexity into. The amount of “complexity” depends on your fabrication process, how big your chip is, the intended power budget, and so forth. Performance depends on how you use that complexity to compute actual tasks. If you know something special about your workload, you can optimize your circuit to do more with less. CPUs are designed to do basically anything, while GPUs assume similar tasks can be run together. If you will only ever run a single program, you can even bake some or all of its logic directly into hardware, called an “application-specific integrated circuit” (ASIC), which is often used for video decoding, rasterizing geometry, and so forth.


This is an old Atom from the era when Intel partnered with Altera on custom chips.

FPGAs are circuits that can be configured for a specific application, but can also be reprogrammed later. Changing tasks requires a significant amount of time (sometimes hours), but that is easier than reconfiguring an ASIC, which involves removing it from your system, throwing it in the trash, and fabricating a new one. FPGAs are not quite as efficient as a dedicated ASIC, but they are about as close as you can get without translating the actual source code directly into a circuit.

Intel, after purchasing FPGA manufacturer Altera, will integrate its technology into Xeons in Q1 2016. This will be useful for offloading specific tasks that dominate a server's total workload. According to PC World, they will be integrated as a two-chip package in which both the CPU and FPGA can access the same cache. I'm not sure what form of heterogeneous memory architecture Intel is using, but this would be a great example of a part that could benefit from in-place acceleration. You could imagine a simple function being baked into the FPGA to, I don't know, process large videos in very specific ways without expensive copies.

Again, this is not a consumer product, and may never be. Reprogramming an FPGA can take hours, and I can't think of too many situations where consumers will trade off hours of time to switch tasks with high performance. Then again, it just takes one person to think of a great application for it to take off.

Source: PCWorld

Blackberry exposes its Privs and the world ... likes it?

Subject: Mobile | November 20, 2015 - 06:23 PM |
Tagged: blackberry, Priv, Android

Could it truly be enough to add a little Android to your BlackBerry to bring the company back to some form of popularity?  From what this contributor at The Register has to say, it could very well be what the company once known as RIM needed.  The Priv is described as the least irritating Android phone they've ever used, which translates to high praise when you are talking about a BlackBerry device.  The sliding keyboard is actually useful, and the BlackBerry DTek security app is decent, though it requires a Google account to be linked to the phone, as do many other apps.  Check out the review to see if this is a berry-flavoured Lollipop you might actually want a few licks at.


"Other than that, none of which really counts, I think this might be my least disliked Android phone so far. It’s at least as good as the best Android phones I have used, because it’s the same as all other Androids, just minus the garbage often layered on top. And it’s better, because the BlackBerry stuff layered on top is very far from being garbage."

Here are some more Mobile articles from around the web:


Source: The Register

Intel Launches Knights Landing-based Xeon Phi AIBs

Subject: Processors | November 18, 2015 - 07:34 AM |
Tagged: Xeon Phi, knights landing, Intel

The add-in board version of the Xeon Phi, which Intel aims at supercomputing audiences, has just launched. Intel also announced that this product will be available as a socketed processor embedded in, as PC World states, “a limited number of workstations” by the first half of next year. The interesting part about these processors is that they combine a GPU-like architecture with the x86 instruction set.


Image Credit: Intel (Developer Zone)

In the case of next year's socketed Knights Landing CPUs, you can even boot your OS with it (and no other processor installed). It will probably be a little like running a 72-core Atom-based netbook.

To make it a little more clear, Knights Landing is a 72-core, 512-bit processor. You might wonder how that can compete against a modern GPU, which has thousands of cores, but those are not really cores in the CPU sense. GPUs crunch massive amounts of calculations by essentially tying several cores together, and doing other tricks to minimize die area per effective instruction. NVIDIA ties 32 instructions together and pushes them down the silicon. As long as they don't diverge, you can get 32 independent computations for very little die area. AMD packs 64 together.

Knights Landing does the same. Its 512-bit registers can hold 16 single-precision (32-bit) values and operate on them simultaneously.

16 times 72 is 1152. All of a sudden, we're in shader-count territory. This is one of the reasons why they can achieve such high performance with “only” 72 cores, compared to the “thousands” that are present on GPUs. They're actually on a similar scale, just counted differently.

Update: (November 18th @ 1:51 pm EST) I just realized that, while I kept saying "one of the reasons", I never elaborated on the other points. Knights Landing also has four threads per core. So that "72 core" is actually "288 thread", with 512-bit registers whose SIMD instructions operate on sixteen 32-bit values simultaneously. While hyperthreading is not known to be 100% efficient, you could consider Knights Landing to be a GPU with 4608 shader units. Again, it's not the best way to count it, but it could sort-of work.
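The counting in the last few paragraphs can be restated in a few lines (numbers from the article; this is just the lane arithmetic, not a performance claim):

```python
# Knights Landing lane arithmetic, as described above.
register_bits, lane_bits = 512, 32   # 512-bit registers, single-precision lanes
cores, threads_per_core = 72, 4

lanes_per_register = register_bits // lane_bits    # 16 values per instruction
simultaneous_ops = lanes_per_register * cores      # "shader-count territory"
thread_view = simultaneous_ops * threads_per_core  # counting hyperthreads too

assert lanes_per_register == 16
assert simultaneous_ops == 1152
assert thread_view == 4608
```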

So in terms of raw performance, Knights Landing can crunch about 8 TeraFLOPS of single-precision math or around 3 TeraFLOPS of double-precision, 64-bit math. That is around 30% faster than the Titan X in single precision, and around twice the performance of the Titan Black in double precision. NVIDIA basically removed the FP64 compute units from Maxwell / Titan X, so Knights Landing is about 16x faster there, but that's not really a fair comparison; NVIDIA recommends Kepler for double-precision workloads.

So, interestingly, Knights Landing would be a top-tier graphics card (in terms of shading performance) if it were compatible with typical graphics APIs. Of course, it's not, and it will be priced far higher than, for instance, the AMD Radeon Fury X. Knights Landing isn't available on Intel ARK yet, but previous models are in the $2000 - $4000 range.

Source: PC World

NVIDIA Re-Releases SHIELD Tablet as K1 - Cuts Price to $199

Subject: Mobile | November 18, 2015 - 06:41 PM |
Tagged: tegra k1, tablet, shield tablet k1, shield controller, shield, nvidia, gaming tablet, Android

NVIDIA has released their updated version of the SHIELD tablet with a new name, but very little has changed other than the name (now the SHIELD tablet K1) and the price - now $100 less expensive at $199.99.


The SHIELD tablet K1 (pictured case and controller are not included)

Under the hood, the 8-inch Android-powered tablet is identical to its predecessor: the quad-core Tegra K1 processor and its 192 CUDA core GPU power the gaming action on the 1920x1200 display. The controller is still a separate $59.99 purchase, but of course it is not required to use the tablet.

Here are full specs from NVIDIA:

  • Processor: NVIDIA Tegra K1 192 core Kepler GPU (2.2 GHz ARM Cortex A15 CPU with 2 GB RAM)
  • Display: 8-inch 1920x1200 multi-touch full-HD display
  • Audio: Front-facing stereo speakers with built-in microphone
  • Storage: 16 GB
  • Wireless: 802.11n 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.0 LE, GPS/GLONASS
  • I/O: Mini-HDMI output, Micro-USB 2.0, MicroSD slot, 3.5 mm stereo headphone jack with microphone support
  • Motion Sensors: 3-axis gyro, 3-axis accelerometer, 3-axis compass
  • Cameras: Front, 5MP HDR; Back, 5MP auto-focus HDR
  • Battery: 19.75 Watt Hours
  • Dimensions: Weight, 12.6 oz (356 g); H x W x D: 8.8 in (221 mm) x 5.0 in (126 mm) x 0.36 in (9.2 mm)
  • Operating System: Android Lollipop
  • Gaming Features: SHIELD controller compatible, GeForce NOW cloud gaming service, Console Mode, NVIDIA ShadowPlay
  • Included Apps: Google Play, NVIDIA SHIELD Hub, Fallout Shelter, NVIDIA Dabbler, Squid, Twitch


This update really comes down to price, as NVIDIA is being more aggressive about driving adoption of their gaming tablet with the new MSRP. This doesn't come without some concessions, however, as the SHIELD tablet K1 ships without any accessories (no USB cable or charger). It's a move reminiscent of Nintendo with the "New 3DS XL", which also shipped without a charger, and the standard micro-USB connection should be readily at hand for most of the target audience.

The question, of course, must be: is this now a more compelling product at $199? It does make the controller seem a bit more affordable, considering the bundle will now run $260 - $40 below the previous tablet-only price. Time will tell (and of course you can let us know in the comments below!).

NVIDIA is selling the SHIELD tablet K1 directly from their web store, and it's already on Amazon for the same $199.99 price.

Source: NVIDIA

Security Professionals Find eDellRoot Superfishy

Subject: Systems | November 23, 2015 - 09:25 PM |
Tagged: dell, superfish, edellroot

The pun was too tempting, but don't take it too seriously, even though the situation is relatively similar. In short, Dell installs a long-lived root certificate on their machines with a private key that is now compromised (because they didn't exactly protect it well). This certificate, and the compromised private key, can be used to sign secure connections without needing verification by a Certificate Authority. In other words, it adds a huge level of unwarranted trust to phishing and man-in-the-middle attacks.


Dell has not really made any public comment on this issue yet. I don't really count the tweet from Dell Cares, because customer support is a terrible source for basically any breaking news. It's best to wait until Dell brings out an official statement through typical PR channels before assuming what their position is. Regardless of what they say, of course, your security will be heavily reduced until the certificate and eDell plug-in are removed from your device.

I'm really just wondering if Dell will somehow apologize, or stick to their guns.

Source: Duo Security

ASUS Announces New N Series Laptops with Intel Skylake and 4K IPS

Subject: Systems, Mobile | November 22, 2015 - 01:30 AM |
Tagged: Skylake, PCIe SSD, notebook, N752, N552, laptop, ips, Intel Core i7, GTX 960M, asus, 4k

ASUS has added two new laptops to their N series lineup of premium, entertainment-focused laptops. The new models offer Intel’s 6th-gen (Skylake) Core i7 processors and high resolution IPS displays, as well as fast PCIe storage and discrete NVIDIA graphics.


The new models are the 15.6-inch N552 and 17.3-inch N752, and both sizes offer wide-gamut IPS display options up to 3840x2160 with 100% sRGB coverage. The displays are powered by graphics up to a discrete NVIDIA GeForce GTX 960M. Quad-core Intel Core i7 processors power both models, with a generous 16GB of RAM standard. Storage is provided via a PCIe x4 SSD with speeds of up to 1500 MB/s and capacities up to 512 GB, and external connectivity includes a USB 3.1 Gen 2 Type-C port.


While boasting powerful specs, these N-series laptops are also geared toward entertainment, with ASUS drawing attention to the sound from their “SonicMaster” audio system, which boasts powerful B&O ICEpower class-D amplification for the laptop’s front-facing speakers. Other features include backlit keys with 1.8 mm of travel, and aluminum covering the keyboard area and lid.

The new models haven’t shown up on the U.S. product pages just yet, so pricing and availability are not yet known.

Source: TechPowerUp

AMD is the new King of Crimson

Subject: Graphics Cards | November 24, 2015 - 01:36 PM |
Tagged: radeon software, radeon, low frame rate compensation, freesync, frame pacing, crimson, AMD VISION Engine

In case you thought we missed something in our discussion of the new AMD Crimson software, you can check out what some of the other websites thought of the new release.  The Tech Report is a good first stop; they used the Fable Legends DX12 benchmark to test the frame time improvements, which will be of interest to those who do not obsess over DX9 games and their performance.  They also delve a bit more into the interface, so you can see what the new screens will look like as well as the path that will take you to a familiar settings screen.  Check out their impressions right here.


"AMD's Radeon Software Crimson Edition is the second in a line of major annual graphics driver updates from the company. Crimson also replaces the Catalyst Control Center software with a faster, more refined utility called Radeon Settings. We dug in to see what Crimson has to offer."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Logitech's Artemis Spectrum headset; 7.1 audiophile quality?

Subject: General Tech | November 18, 2015 - 05:28 PM |
Tagged: logitech, G633 Artemis Spectrum, 7.1 headset

Logitech talks big about their G633 Artemis Spectrum gaming headset, promising audiophile-like quality across seven adjustable audio channels plus the good old .1 bass channel.  They do have a history of producing quality audio products, so Techgage set out to determine how well Logitech did on this headset.  The software allows you, among other things, to choose between DTS Headphone:X and Dolby Surround modes, with each channel's volume individually adjustable in Dolby mode; the effect was clearly audible in Techgage's gaming tests.  In the end, the $149.99 MSRP and audio quality nowhere near the levels an audiophile would want prevented Techgage from loving the G633, but for atmospheric gaming it is a decent choice for the well-off gamer.


"When Logitech announced its Artemis Spectrum gaming headsets, it said that they would deliver “audiophile-like” sound. Now, that’s a lofty promise. The company sent us the wired version, the G633, for us to review. Does it live up to its divine name and ambitious promises, or does it fall short, leaving us mere mortals still hunting for a god-like audio experience?"

Here is some more Tech News from around the web:

Audio Corner

Source: Techgage

Windows 10 Tool Now Reverted to Build 10240

Subject: General Tech | November 23, 2015 - 08:15 PM |
Tagged: windows 10, microsoft

UPDATE (Nov 24th, 8pm ET): As I was informed, both on Twitter and in the comments, the update has been restored. Apparently the issue was that this tool, when upgrading Windows 10 to Windows 10 1511, accidentally reset four privacy settings to default. They also happened to be four of the less-severe ones, such as whether to allow apps to run in the background and whether settings should sync between devices. It has apparently been fixed and the tool will install the latest version of Windows 10 once more.

Source: Ars Technica

Regardless of your opinion about Windows 10, I'm glad that Microsoft has once again provided a way to force a specific version onto your device. Their recent statement, telling users that Windows Update will deliver the correct build eventually, is not comforting if someone is failing to receive the update. Is it coming? Or did it fail for some reason? I also wondered whether the 30-day policy would still be enforced, making clean installs that much more annoying. Turns out it was all hypothetical, though, and Microsoft was planning on reinstating the tool almost immediately.

This is a bit surprising and disappointing. When the November 2015 update for Windows 10 went live, existing users could upgrade with Windows Update (if it let them) and the rest could force an in-place upgrade from Windows 7, 8.x, and earlier builds of Windows 10 using the tool. The latter method has apparently been reverted to the original Windows 10 build from July 2015.


This image is getting a lot more use than I intended.

Why? Who knows. They are still offering the update through Windows Update, and Microsoft claims that they have no intention of pulling it. This concerns me, because there are a few situations where Windows 10 updates will get stuck, such as if you get it through Windows Update then uninstall it. I have not seen any report cover the official procedure for this issue. Also, I wonder if there's a way to get past Microsoft's 30-day no-update policy.

According to WinBeta, Microsoft's official statement contains the following: “Microsoft has not pulled the Windows 10 November 10 update. The company is rolling out the November update over time – if you don’t see it in Windows Update, you will see it soon.” (Emphasis not mine.)

We'll probably hear more about this as the week goes on.

Source: WinBeta

Podcast #376 - Intel Speed Shift, CPU Coolers from Noctua and DEEPCOOL, Broadwell-E Rumors, and more!

Subject: General Tech | November 19, 2015 - 02:42 PM |
Tagged: podcast, video, noctua, Deepcool, Gamer Storm Gabriel, Intel, speed shift, amd, R9, fury x, trixx, Broadwell-E, kaby lake, nvidia, shield tablet k1, knights landing, asus, chromebit

PC Perspective Podcast #376 - 11/19/2015

Join us this week as we discuss Intel Speed Shift, CPU Coolers from Noctua and DEEPCOOL, Broadwell-E Rumors, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Sebastian Peak

Program length: 1:19:22

  1. Week in Review:
  2. 0:32:10 This episode of PC Perspective Podcast is brought to you by Braintree. Even the best mobile app won’t work without the right payments API. That’s where the Braintree v.0 SDK comes in. One amazingly simple integration gives you every way to pay. Try out the sandbox and see for yourself at braintree­payments.com/pcper
  3. News item of interest:
  4. Hardware/Software Picks of the Week:
  5. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Deals of the Day: 960GB SSD for $199, $69 Athlon X4, GTX 970 Price Drop

Subject: General Tech | November 24, 2015 - 02:55 PM |
Tagged: SanDisk Ultra II, MSI GTX 970 Gaming, GTX 970, 960GB SSD

The internet is full of sales this week, and there are some great PC hardware deals out there beginning with the best price per GB on an SSD we've ever seen.


At $199.99 shipped this 960 GB SanDisk Ultra II SSD is a stunning $0.20/GB, and offers good speeds for a SATA III drive with up to 550 MB/s reads and 500 MB/s writes, along with "n-Cache 2.0" which SanDisk explains is "a large, non-volatile write cache (which) consolidates small writes to boost random write performance".

What better to fill up that huge SSD than a library of games? And if you're in the market for a new graphics card to drive them, there are some excellent deals out there. A good mid-range GPU option is the oft-maligned NVIDIA GeForce GTX 970, and memory controversy aside it's a very strong performer for the money. Lately prices have dropped a bit, and there are some great options out there when you factor in rebates.


While you can find some lower-cost GTX 970s out there, this is a good deal for one of MSI's overclocked Gaming series cards. And if you're looking for a quad-core processor to help drive a new GPU, how about AMD's ultra-affordable Athlon X4 860K, now under $70!


It's nice to see prices starting to drop on some solid upgrades, and we're currently working on our annual holiday gift guide with more recommendations for a tech-filled holiday season. Stay tuned!

AOC Introduces 24-inch 4K PLS Monitor with HDMI 2.0

Subject: Displays | November 18, 2015 - 10:04 AM |
Tagged: U2477PWQ, PLS, monitor, HDMI 2.0, AOC, 4k monitor, 24-inch display

AOC has announced a new, compact 4K display with a PLS panel, and the U2477PWQ also features HDMI 2.0 input.


With a PLS panel providing full 178/178 viewing angles, the U2477PWQ looks like an attractive alternative to TN designs, if similarly priced. The 16.7 million colors specified indicate 8-bit panel/processing, so this won't offer the same level of color gradation as a 10-bit IPS (or PLS) panel, though that is likely not an issue unless the display is intended for serious color work. As far as ergonomics are concerned, the stand offers full height, pivot, swivel, and tilt adjustment, and there is also a standard 100 mm VESA mount on the back.
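The bit-depth inference is simple arithmetic; the per-channel level count, cubed across red, green, and blue, gives the advertised color count:

```python
# Why "16.7 million colors" implies an 8-bit panel:
levels_8bit = 2 ** 8      # 256 levels per RGB channel
levels_10bit = 2 ** 10    # 1024 levels per channel

colors_8bit = levels_8bit ** 3    # three channels: R x G x B
colors_10bit = levels_10bit ** 3

assert colors_8bit == 16_777_216         # ~16.7 million, matching the spec sheet
assert colors_10bit == 1_073_741_824     # ~1.07 billion for a true 10-bit panel
```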

Specifications from AOC:

  • Monitor Size: 23.6 Inch
  • Resolution: 3840x2160@60Hz
  • Response time: 4 ms
  • Panel Type: PLS
  • Viewing Angle: 178/178
  • Colors: 16.7 Million
  • Brightness: 300 cd/m2 (type)
  • Contrast Ratio: 1000:1
  • Dynamic Contrast Ratio: 50M:1
  • HDCP: Compatible    
  • Input: DVI, HDMI 2.0, DisplayPort, D-Sub    
  • Ergonomics: Pivot, Swivel, Tilt -5/+23; Height Adjustment 130mm
  • Other Features:    FlickerFree, Vesa Wallmount 100x100, i-Menu, e-Saver, Screen+
  • Power Source: 100 - 240V 50/60Hz
  • Power Consumption: On 34W; Standby 0.5W; Off: 0.3W


This new display is listed on AOC's European site here, and it appears that the U2477PWQ is not yet available in the United States.

Source: FlatpanelsHD

AMD R9 Fury X Voltage and HBM Unlocked with Sapphire TriXX 5.2.1

Subject: Graphics Cards | November 18, 2015 - 01:22 PM |
Tagged: Sapphire TriXX, R9 Fury X, overclocking, hbm, amd

The new version (5.2.1) of Sapphire's TriXX overclocking utility has been released, and it finally unlocks voltage and HBM overclocking for AMD's R9 Fury X.


(Image credit: Sapphire)

Previously, the core voltage of the R9 Fury X was not adjustable, leaving what would seem to be quite a bit of untapped headroom for a card that shipped with a powerful liquid-cooling solution rated for 500 watts of thermal dissipation. This should allow for much better results than Ryan was able to achieve when he attempted overclocking for our review of the R9 Fury X in June (without the benefit of voltage adjustments):

"My net result: a clock speed of 1155 MHz rather than 1050 MHz, an increase of 10%. That's a decent overclock for a first attempt with a brand new card and new architecture, but from the way that AMD had built up the "500 watt cooler" and the "375 watts available power" from the dual 8-pin power connectors, I was honestly expecting quite a bit more. Hopefully we'll see some community adjustments, like voltage modifications, that we can mess around with later..."


(Image credit: Sapphire)

Will TriXX v5.2.1 unleash the full potential of the Fury X? We will have to wait for some overclocked benchmark numbers, but having the ability can only be a good thing for enthusiasts.

Source: WCCFtech
Manufacturer: EVGA

Introduction and Features


EVGA recently introduced three new Platinum certified power supplies in their popular SuperNOVA line, the 650P2, 750P2 and 850P2. All three power supplies are 80 Plus Platinum certified for high efficiency and feature all modular cables, high-quality Japanese brand capacitors, and a quiet 140mm cooling fan (with the ability to operate in silent, fan-less mode at low to mid power levels). And in addition to delivering excellent performance with quiet operation, these new power supplies are backed by a 10-year warranty!


EVGA was founded in 1999 and is headquartered in Brea, California. The company continues to specialize in producing NVIDIA-based graphics adapters and Intel-based motherboards while steadily expanding its PC power supply product line, which currently includes thirty-four models ranging from the high-end 1,600W SuperNOVA T2 to the budget-minded EVGA 400W power supply.


In this review we will be taking a detailed look at both the EVGA SuperNOVA 650P2 and 750P2 power supplies. It’s always nice when we receive two slightly different units from the same product series, as it lets us check for consistency during testing.

Here is what EVGA has to say about the new SuperNOVA P2 Platinum PSUs: “The unbeatable performance of the EVGA SuperNOVA P2 power supply line is now available in 850, 750 and 650 watt versions. Based on the award winning P2 power supplies, these units feature 80 Plus Platinum rated efficiency, and clean, continuous power to every component. The ECO Control Fan system offers fan modes to provide absolutely zero fan noise during low to medium load operations. Backed by an award winning 10 year warranty, and 100% Japanese capacitor design, the EVGA SuperNOVA 850, 750 and 650 P2 power supplies offer unbeatable performance and value."


EVGA SuperNOVA 650W P2 and 750W P2 PSU Key Features:

•    Fully modular cables to reduce clutter and improve airflow
•    80 PLUS Platinum certified, with up to 92% efficiency
•    LLC Resonant circuit design for high efficiency
•    Tight voltage regulation, stable power with low AC ripple and noise
•    Highest quality Japanese brand capacitors ensure long-term reliability
•    Quiet 140mm Double ball bearing fan for reliability and quiet operation
•    ECO Intelligent Thermal Control allows silent, fan-less operation at low power
•    NVIDIA SLI & AMD Crossfire Ready
•    Compliance with ErP Lot 6 2013 Requirement
•    Active Power Factor correction (0.99) with Universal AC input
•    Complete Protections: OVP, UVP, OPP, OCP and SCP
•    10-Year warranty and EVGA Customer Support
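For a sense of what that "up to 92% efficiency" figure means at the wall, here is a quick sketch of the arithmetic (the load points are illustrative examples, not EVGA test data):

```python
# AC power drawn from the wall for a given DC load, at a fixed efficiency.
# 80 Plus Platinum requires roughly 90-92% efficiency across 20-100% load;
# the 92% figure below is the "up to" number from EVGA's spec sheet.
def ac_draw_watts(dc_load_w, efficiency=0.92):
    """Watts pulled from the outlet to deliver dc_load_w to the system."""
    return dc_load_w / efficiency

for load_w in (325, 650, 750):
    waste_w = ac_draw_watts(load_w) - load_w
    print(f"{load_w} W DC -> {ac_draw_watts(load_w):.0f} W AC "
          f"({waste_w:.0f} W lost as heat)")
```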

Please continue reading our review of the EVGA SuperNOVA 650/750 P2 PSUs!!!

What the hell Dell?

Subject: General Tech | November 24, 2015 - 12:42 PM |
Tagged: dell, superfish, security, edellroot

As Scott mentioned yesterday, Dell refused to learn from Lenovo's lesson and repeated the exact same mistake with eDellRoot, a self-signed root CA certificate. Unlike SuperFish, which existed to display targeted ads, eDellRoot serves no clear purpose apart from a vague, Microsoft-like promise of "easier customer support," yet it exposes you to exactly the same security risks as SuperFish did.  You could remove the cert manually; however, as it resides in Dell.Foundation.Agent.Plugins.eDell.dll, it will return on the next boot and can return on fresh Windows installs via Dell driver updates, something which will be of great concern to their business customers.

Dell has finally responded to the issue, stating, "The recent situation raised is related to an on-the-box support certificate intended to provide a better, faster and easier customer support experience. Unfortunately, the certificate introduced an unintended security vulnerability." The company has also provided a process to permanently remove the certificate from the machine in this Word document.  You can check for the presence of the cert on your machine via those two links. 

However, the best was yet to come, as researchers have found a second cert, as well as an expired Atheros Authenticode cert for Bluetooth along with its private key, on a limited number of new Dell computers.  As Dell made no mention of these additional certificates in their statement to the press, it is hard to give them the benefit of the doubt.  The Bluetooth cert will not make you vulnerable to a man-in-the-middle attack; however, the second cert is as dangerous as eDellRoot and can be used to snoop on encrypted communications.  The second cert was found on a SCADA machine, which is, as they say, a bad thing. 

We await Dell's response to the second discovery, as well as further research to determine how widespread the new certs actually are.  So far, Dell XPS 15 laptops, M4800 workstations, and Inspiron desktops and laptops have been found to contain these security issues.  The chances of you falling victim to a man-in-the-middle attack thanks to these vulnerabilities are slim but not zero, so be aware of them and keep an eye out for them on your systems.  With Lenovo and Dell both having been caught, it will be interesting to see whether HP and other large vendors learn this lesson or if it will take a third company being caught exposing its customers to unnecessary risk.


"A second root certificate and private key, similar to eDellRoot along with an expired Atheros Authenticode cert and private key used to sign Bluetooth drivers has been found on a Dell Inspiron laptop. The impact of these two certs is limited compared to the original eDellRoot cert."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

The R9 380X arrives

Subject: Graphics Cards | November 19, 2015 - 01:37 PM |
Tagged: asus, strix, Radeon R9 380X, tonga

The full serving of Tonga in the AMD Radeon R9 380X has 32 compute units, 2048 stream processors, 32 ROPs and 128 texture units, which compares favourably to the 28 CUs, 1792 stream processors, 32 ROPs and 112 texture units of the existing R9 380.  Memory size and bandwidth are unchanged: 182GB/sec of memory bandwidth at the stock effective speed of 5.7GHz, and the GPU clock remains around 970MHz as well.  The MSRP is to be $230 for the base model.
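Those spec-sheet numbers hang together with some simple GCN arithmetic: 64 stream processors per compute unit, and bandwidth from bus width times effective memory clock (the 256-bit bus is Tonga's configuration, assumed here rather than stated above):

```python
# GCN arithmetic behind the R9 380X spec sheet.
SP_PER_CU = 64  # stream processors per GCN compute unit

print(32 * SP_PER_CU)  # 2048 stream processors (R9 380X, full Tonga)
print(28 * SP_PER_CU)  # 1792 stream processors (R9 380, cut-down Tonga)

# Bandwidth = bus width (in bytes) x effective memory clock.
# 256-bit bus assumed (Tonga's width, not stated in the spec list above).
bus_bytes = 256 // 8
bandwidth_gb_s = bus_bytes * 5.7  # 5.7 GHz effective
print(bandwidth_gb_s)  # 182.4, i.e. the quoted ~182 GB/sec
```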

With the specifications out of the way, the next question is how it fares against the direct competition, the GTX 960 and GTX 970.  That is where this review from [H]ard|OCP comes in, with a look at the ASUS STRIX R9 380X DirectCU II OC, which runs at 1030MHz by default and 1050MHz at the push of a button.  Their tests at 1440p were a little disappointing; the card did not perform well until advanced graphics settings were reduced, but at 1080p they saw great performance with all the bells and whistles turned up.  Pricing will be key for this product: if sellers can keep it at or below MSRP it is a better deal than the GTX 970, but if the prices creep closer then the 970 is the better value.


"AMD has let loose the new AMD Radeon R9 380X GPU, today we evaluate the ASUS STRIX R9 380X OC video card and find out how it compares to a 4GB GeForce GTX 960 and GeForce GTX 970 for a wide picture of where performance lies at 1440p or where it does not at 1440p considering your viewpoint."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Should you fear SilverPush?

Subject: General Tech | November 20, 2015 - 02:22 PM |
Tagged: security, silverpush, fud

SilverPush has been around for a while, but it was recently reverse-engineered so that it could be investigated by anyone with an interest in their phone's security.  It is software, often bundled into advertisements or streamed media, that takes advantage of your phone's far greater range of audio sensitivity and the fact that information can be communicated via audio signals.  This could allow an app to communicate with your phone without your knowledge, to collect data from your phone, or even to serve contextual ads on your phone.

However, as you can see from the list of apps which The Register links to, it is not very likely that you have a SilverPush-enabled app installed on your phone, and that is the real key.  If you do not have an app listening for audio signals on those frequencies then you will not suffer the effects of SilverPush.  The moral of the story is that your phone's security starts with you; if you download random free apps and allow them full access to your phone then you should not be surprised by this sort of thing.


"SilverPush's software kit can be baked into apps, and is designed to pick up near-ultrasonic sounds embedded in, say, a TV, radio or web browser advert. These signals, in the range of 18kHz to 19.95kHz, are too high pitched for most humans to hear, but can be decoded by software."
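To illustrate how little machinery this kind of near-ultrasonic signalling actually needs, here is a toy detector built on the Goertzel algorithm. The 18.5 kHz beacon frequency and 44.1 kHz sample rate are illustrative choices within the range quoted above, not SilverPush's actual encoding:

```python
import math

SAMPLE_RATE = 44100  # standard audio rate; phone mics capture well past 18 kHz

def goertzel_power(samples, target_hz, sample_rate=SAMPLE_RATE):
    """Relative power at one frequency bin, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)   # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# A one-tone "beacon" at 18.5 kHz: inaudible to most adults, trivially detectable
beacon = [math.sin(2 * math.pi * 18500 * t / SAMPLE_RATE) for t in range(4410)]
silence = [0.0] * 4410

print(goertzel_power(beacon, 18500) > goertzel_power(silence, 18500))  # True
```

An app embedding such a decoder only needs microphone permission and a bank of these detectors, one per beacon frequency, to receive data, which is why the permissions you grant matter.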

Here is some more Tech News from around the web:

Tech Talk

Source: The Register