
GPU Market sees 20-point swing in 2014: NVIDIA gains, AMD falls

Subject: Graphics Cards | February 21, 2015 - 12:18 PM |
Tagged: radeon, nvidia, marketshare, market share, geforce, amd

Jon Peddie Research, one of the perennial firms that measures GPU market share, came out with a report on Q4 2014 this weekend, and the results are eye-opening. According to the data, NVIDIA and AMD each took dramatic swings from Q4 2013 to Q4 2014.

          Q4 2014   Q3 2014   Q4 2013   Year-to-year change
AMD        24.0%     28.4%     35.0%        -11.0%
Matrox      0.0%      0.1%      0.1%         -0.1%
NVIDIA     76.0%     71.5%     64.9%        +11.1%
S3          0.0%      0.0%      0.0%         +0.0%

Data source: Jon Peddie Research

Here is the JPR commentary to start us out:

JPR's AIB Report tracks computer add-in graphics boards, which carry discrete graphics chips. AIBs are used in desktop PCs, workstations, servers, and other devices such as scientific instruments. They are sold directly to customers as aftermarket products, or are factory installed. In all cases, AIBs represent the higher end of the graphics industry, using discrete chips and private high-speed memory, as compared to the integrated GPUs in CPUs that share slower system memory.

The news was encouraging and seasonally understandable: quarter-to-quarter, the market decreased 0.68% (compared to the desktop PC market, which decreased 3.53%).

On a year-to-year basis, we found that total AIB shipments during the quarter fell 17.52%, a steeper decline than desktop PCs, which fell 0.72%.

However, in spite of the overall decline, somewhat due to tablets and embedded graphics, the PC gaming momentum continues to build and is the bright spot in the AIB market.

icon.jpg

NVIDIA's Maxwell GPU

The overall PC desktop market increased quarter-to-quarter, including double attach (the addition of a second or third AIB to a system with integrated processor graphics) and, to a lesser extent, dual AIBs in performance desktop machines using either AMD's CrossFire or NVIDIA's SLI technology.

The attach rate of AIBs to desktop PCs has declined from a high of 63% in Q1 2008 to 36% this quarter.

The year-to-year change that JPR is reporting is substantial, showing a 20+ point shift in market share in favor of NVIDIA over AMD. According to this data, AMD's market share has now dropped from 35% at the end of 2013 to just 24% at the end of 2014. Meanwhile, NVIDIA continues to truck forward, going from 64.9% at the end of 2013 to 76% at the end of 2014.
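For anyone wondering where the "20+ point" figure comes from, it is simply the two year-over-year deltas combined, since every point AMD loses is a point NVIDIA gains relative to it:

```python
# Year-over-year market share changes from the JPR data (Q4 2013 -> Q4 2014).
amd_change = 24.0 - 35.0      # AMD: -11.0 points
nvidia_change = 76.0 - 64.9   # NVIDIA: +11.1 points

# The total swing between the two is NVIDIA's gain plus AMD's loss.
print(f"Total swing: {nvidia_change - amd_change:.1f} points")
```

That works out to about 22 points, hence the "20+ point swing" in the headline.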

08.jpg

The Radeon R9 285 release didn't have the impact AMD had hoped

Clearly the release of NVIDIA's Maxwell GPUs (the GeForce GTX 750 Ti, GTX 970, and GTX 980) has impacted the market even more than we initially expected. In recent weeks the GTX 970 has been getting a lot of negative press over its memory issue, and I will be curious to see what effect this has on sales in the near future. But the 12-month swing that you see in the table above is the likely cause of the sudden departures of John Byrne, Collette LaForce and Raj Naik.

AMD has good products, even better pricing, and a team of PR and marketing folks who are talented and aggressive. So how can the company recover from this? Products, people: new products. Will the rumors circling around the Radeon R9 390X develop into such a product?

Hopefully 2015 will provide it.

Samsung Promises Another Fix for 840 EVO Slow Down Issue

Subject: Storage | February 20, 2015 - 06:21 PM |
Tagged: tlc, ssd, Samsung, 840 evo

Some of you may have been following our coverage of the Samsung 840 EVO slow down issue. We first reported on this issue last September, and Samsung issued a fix a couple of months later. This tool was effective in bringing EVOs back up to speed, but some started reporting their drives were still slowing down. Since our January follow up, we have been coordinating with Samsung on a possible fix. We actually sent one of our samples off to them for analysis, and have just received this statement: 
 
In October, Samsung released a tool to address a slowdown in 840 EVO Sequential Read speeds reported by a small number of users after not using their drive for an extended period of time. This tool effectively and immediately returned the drive’s performance to normal levels. We understand that some users are experiencing the slowdown again. While we continue to look into the issue, Samsung will release an updated version of the Samsung SSD Magician software in March that will include a performance restoration tool.
 
image.jpg
 
A look at the reduced read speeds of stale data on an 840 EVO which had the original fix applied. Unpatched drives were slowing much further (50-100 MB/s).
 
So it appears that Samsung is still looking into the issue, but will update their Magician software to periodically refresh stale data until they can work out a more permanent fix that would correct all affected 840 EVOs. We have not heard anything about the other TLC models which have been reported to see this same sort of slow down, but we will keep you posted as this situation develops further.

Windows Update Installs GeForce 349.65 with WDDM 2.0

Subject: General Tech, Graphics Cards | February 21, 2015 - 04:23 PM |
Tagged: wddm 2.0, nvidia, geforce 349.65, geforce, dx12

Update 2: Outside sources have confirmed to PC Perspective that this driver contains DirectX 12 as well as WDDM 2.0. They also claim that Intel and AMD have DirectX 12 drivers available through Windows Update as well. After enabling iGPU graphics on my i7-4790K, the Intel HD 4600 received a driver update, which also reports as WDDM 2.0 in DXDIAG. I do not have a compatible AMD GPU to test against (just a couple of old Windows 7 laptops) but the source is probably right and some AMD GPUs will be updated to DX12 too.

So it turns out that if your motherboard dies during a Windows Update reboot, then you are going to be spending several hours reinstalling software and patches, but that is not important. What is interesting is the installed version number for NVIDIA's GeForce Drivers when Windows Update was finished with its patching: 349.65. These are not available on NVIDIA's website, and the Driver Model reports WDDM 2.0.

nvidia-34965-driver.png

It looks like Microsoft pushed out NVIDIA's DirectX 12 drivers through Windows Update. Update 1 Pt. 1: The "Runtime" reporting 11.0 is confusing though, perhaps this is just DX11 with WDDM 2.0?

nvidia-34965-dxdiag.png

I am hearing online that these drivers support the GeForce 600 series and later GPUs, and that there are later, non-public drivers available (such as 349.72 whose release notes were leaked online). NVIDIA has already announced that DirectX 12 will be supported on GeForce 400-series and later graphics cards, so Fermi drivers will be coming at some point. For now, it's apparently Kepler-and-later, though.

So with OS support and, now, released graphics drivers, all that we are waiting on is software and an SDK (plus any NDAs that may still be in effect). With Game Developers Conference (GDC 2015) coming up in a little over a week, I expect that we will get each of these very soon.

Update 1 Pt. 2: I should note that the release notes for 349.72 specifically mention DirectX 12. As mentioned above, it is possible that 349.65 contains just WDDM 2.0 and not DX12, but it contains at least WDDM 2.0.

Subject: Processors
Manufacturer: AMD

AMD Details Carrizo Further

Some months back AMD introduced us to their “Carrizo” product.  Details were slim, but we learned that this would be another 28 nm part with improved power efficiency over its predecessor.  It would be based on the new “Excavator” core, the final implementation of the Bulldozer architecture.  The graphics would be based on the latest iteration of the GCN architecture as well.  Carrizo would be a true SOC in that it integrates the southbridge controller.  The final piece of information that we received was that it would be interchangeable with the Carrizo-L SOC, which is an extremely low power APU based on the Puma+ cores.

car_01.jpg

A few months later we were invited by AMD to their CES meeting rooms to see early Carrizo samples in action.  These products were running a variety of applications very smoothly, but we were not informed of speeds and actual power draw.  All we knew was that Carrizo was working and able to run pretty significant workloads like high quality 4K video playback.  Details were yet again very scarce, other than the expected timeline of release, the TDP ratings of these future parts, and how it was going to be a significant jump in energy efficiency over the previous Kaveri-based APUs.

AMD is presenting more information on Carrizo at the ISSCC 2015 conference.  This information dives a little deeper into how AMD has made the APU smaller, more power efficient, and faster overall than the previous 15 watt to 35 watt APUs based on Kaveri.  AMD claims to have a product that improves power efficiency in a way never before seen from the company.  This is particularly important considering that Carrizo is still a 28 nm product.

Click here to read more about AMD's ISSCC presentation on Carrizo!

NVIDIA Faces Class Action Lawsuit for the GeForce GTX 970

Subject: Graphics Cards | February 23, 2015 - 04:12 PM |
Tagged: nvidia, geforce, GTX 970

So apparently NVIDIA and a single AIB partner, Gigabyte, are facing a class action lawsuit because of the GeForce GTX 970 4GB controversy. I am not sure why they singled out Gigabyte, but I guess that is the way things go in the legal world. Unlucky for them, and seemingly lucky for the rest.

nvidia-970-architecture.jpg

For those who are unaware, the controversy stems from NVIDIA claiming that the GeForce GTX 970 has 4GB of RAM, 64 ROPs, and 2048KB of L2 cache. In actuality, it has 56 ROPs and 1792KB of L2 cache. The main talking point is that the RAM is segmented into two partitions, one of 3.5GB and another of 0.5GB. All 4GB are present on the card, though, and accessible (unlike the disabled L2 cache and ROPs). Then again, I cannot find an instance in the class action lawsuit's exhibits that claims an incorrect number of ROPs or amount of L2 cache.

Again, the benchmarks that you saw when the GeForce GTX 970 launched are still valid. Since the issue came up, Ryan has also tried various configurations of games in single- and multi-GPU systems to find conditions that would make the issue appear.

Source: Court Filing
Subject: Editorial
Manufacturer: Bohemia Interactive

Project Lead: Joris-Jan van ‘t Land

Thanks to Ian Comings, guest writer from the PC Perspective Forums who conducted the interview of Bohemia Interactive's Joris-Jan van ‘t Land. If you are interested in learning more about ArmA 3 and hanging out with some PC gamers to play it, check out the PC Perspective Gaming Forum!

I recently got the chance to send some questions to Bohemia Interactive, a computer game development company based out of Prague, Czech Republic, and a member of IDEA Games. Bohemia Interactive was founded in 1999 by CEO Marek Španěl, and it is best known for PC gaming gems like Operation Flashpoint: Cold War Crisis, the ArmA series, Take On Helicopters, and DayZ. The questions were answered by ArmA 3's Project Lead, Joris-Jan van ‘t Land.

PC Perspective: How long have you been at Bohemia Interactive?

VAN ‘T LAND: All in all, about 14 years now.

PC Perspective: What inspired you to become a Project Lead at Bohemia Interactive?

VAN ‘T LAND: During high school, it was pretty clear to me that I wanted to work in game development, and just before graduation, a friend and I saw a first preview for Operation Flashpoint: Cold War Crisis in a magazine. It immediately looked amazing to us; we were drawn to the freedom and diversity it promised and the military theme. After helping run a fan website (Operation Flashpoint Network) for a while, I started to assist with part-time external design work on the game (scripting and scenario editing). From that point, I basically grew naturally into this role at Bohemia Interactive.

arma3_screenshot_02.jpg

PC Perspective: What part of working at Bohemia Interactive do you find most satisfying? What do you find most challenging?

VAN ‘T LAND: The amount of freedom and autonomy is very satisfying. If you can demonstrate skills in some area, you're welcome to come up with random ideas and roll with them. Some of those ideas can result in official releases, such as Arma 3 Zeus. Another rewarding aspect is the near real-time connection to those people who are playing the game. Our daily Dev-Branch release means the work I do on Monday is live on Tuesday. Our own ambitions, on the other hand, can sometimes result in some challenges. We want to do a lot and incorporate every aspect of combat in Arma, but we're still a relatively small team. This can mean we bite off more than we can deliver at an acceptable level of quality.

PC Perspective: What are some of the problems that have plagued your team, and how have they been overcome?

VAN ‘T LAND: One key problem for us was that we had no real experience with developing a game in more than one physical location. For Arma 3, our team was split over two main offices, which caused quite a few headaches in terms of communication and data synchronization. We've since had more key team members travel between the offices more frequently and improved our various virtual communication methods. A lot of work has been done to try to ensure that both offices have the latest version of the game at any given time. That is not always easy when your bandwidth is limited and games are getting bigger and bigger.

Continue reading our interview with Bohemia Interactive!!

Windows 10 Technical Preview Build 10022 Spotted

Subject: General Tech | February 26, 2015 - 07:00 AM |
Tagged: windows 10, windows, microsoft

WZor, a group in Russia that somehow acquires many Windows leaks, has just published screenshots of Windows 10 Build 10022 and Windows Server Build 9926. As far as we can tell, not much has changed. We see neither an upgraded Cortana nor a look at the Spartan browser. The build is not labeled “Microsoft Confidential” though, which makes people believe that it is (or was) intended for public release -- maybe as early as this week.

microsoft-windows10-10022-leak.jpg

Image Credit: WZor Twitter

Honestly, I do not see anything different in the provided screenshots apart from the incremented version number. It is possible that this build addresses back-end issues, leaving the major new features for BUILD in late April. Leaked notes (also by WZor) for build 10014, called an “Early Partner Drop”, suggest that version was designed for hardware and software vendors. Perhaps the upcoming preview build is designed to give third parties a platform to develop updates ahead of Microsoft releasing the next (or second-next) big build?

Either way, it seems like we will get it very soon.

Source: WZor

You have to pay to play, Gigabyte's overclockable GTX 980 G1 GAMING

Subject: Graphics Cards | February 20, 2015 - 02:08 PM |
Tagged: gigabyte, nvidia, GTX 980 G1 GAMING, windforce, maxwell, factory overclocked

If you want the full complement of Maxwell Streaming Multiprocessors and ROP units, as well as that last 500MB of RAM running at full speed, then you will need to pay for a GTX 980.  One choice is Gigabyte's GTX 980 G1 Gaming, which will cost you $580, around $240 more than a GTX 970, but the premium can be worth it if you need the power.  [H]ard|OCP took the already overclocked card from a boost GPU frequency of 1329MHz and RAM at 7GHz all the way to a boost of 1513MHz with RAM topping out at 8.11GHz.  That overclock had a noticeable effect on performance and helped the card garner an Editor's Choice award.  See it in action here.
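To put the [H] overclock in relative terms, a quick sketch of the percentage gains (using the stock and achieved clocks quoted above):

```python
# Percentage gain from a stock clock to an achieved overclock.
def oc_gain(stock: float, overclocked: float) -> float:
    return (overclocked - stock) / stock * 100

print(f"GPU boost: {oc_gain(1329, 1513):.1f}%")  # 1329 MHz -> 1513 MHz
print(f"Memory:    {oc_gain(7.0, 8.11):.1f}%")   # 7.0 GHz -> 8.11 GHz effective
```

Both the core and the memory picked up roughly 14-16% over the factory-overclocked settings.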

1424030496NnqLS3kD4X_1_1.jpg

"Today we have the GIGABYTE GTX 980 G1 GAMING, which features the WINDFORCE 600W cooling system and a high factory overclock. We will make comparisons to the competition, find out how fast it is compared to a GTX 980, and you won't believe the overclock we achieved! We will make both performance and price comparisons."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP
Manufacturer: ASUS

Technology Background

Just over a week ago Allyn spent some time with the MSI X99A Gaming 9 ACK motherboard, a fact that might seem a little odd to our frequent readers. Why would our storage editor be focusing on a motherboard? USB 3.1, of course! When we visited MSI at CES in January they were the first company to show working USB 3.1 hardware, and performance numbers that we were able to duplicate in our testing when MSI sent us similar hardware.

150213-160838.jpg

But ASUS is in this game as well, preparing its product lines with USB 3.1 support courtesy of the same ASMedia controller we looked at before. ASUS has a new revision of several motherboards planned with integrated on-board USB 3.1 but is also going to be releasing an add-in card with USB 3.1 support for existing systems.

Today we are going to test that add-in card to measure ASUS' implementation of USB 3.1 and see how it stacks up to what MSI had to offer and what improvements and changes you can expect from USB 3.0.

USB 3.1 Technology Background

Despite the minor version number change, the technological and speed differences in USB 3.1, also known as SuperSpeed+, are substantial. Allyn did a good job of summarizing the changes, which include a 10 Gbps link interface and a dramatic drop in encoding overhead that enables peak theoretical performance improvements of 2.44x compared to USB 3.0.

120606_lecroy_4-.jpg

USB 3.1 is rated at 10 Gbps, twice that of USB 3.0. The little-reported-on nugget of info from the USB 3.1 specification relates to how it classifies raw vs. expected speeds. Taking USB 3.0 as an example, SuperSpeed can handle a raw 5 Gbps data rate, but after subtracting the overhead (packet framing, flow control, etc.), you are left with ~450 MB/s of real throughput. SuperSpeed+ upgrades the bit encoding from 8b/10b (80% efficient) to 128b/132b (97% efficient) *in addition to* doubling the raw data rate. This means that even after accounting for overhead, SuperSpeed+'s best case throughput should work out to ~1.1 GB/s. That's not a 2x speed improvement - it is actually 2.44x the speed of USB 3.0. SuperSpeed+, indeed!
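The encoding arithmetic behind those numbers is easy to check for yourself; a minimal sketch:

```python
# Effective link rate after bit-encoding overhead. Packet framing and flow
# control shave these numbers further, to the ~450 MB/s and ~1.1 GB/s of
# real-world throughput quoted above.

def encoded_gbps(raw_gbps: float, payload_bits: int, total_bits: int) -> float:
    return raw_gbps * payload_bits / total_bits

usb30 = encoded_gbps(5.0, 8, 10)      # 8b/10b: 80% efficient
usb31 = encoded_gbps(10.0, 128, 132)  # 128b/132b: ~97% efficient

print(f"USB 3.0 after encoding: {usb30:.2f} Gbps")
print(f"USB 3.1 after encoding: {usb31:.2f} Gbps")
print(f"Improvement: {usb31 / usb30:.2f}x")
```

The encoding-only ratio comes out to about 2.42x; using the real-world throughput figures (~450 MB/s vs ~1.1 GB/s) gives the ~2.44x quoted in the article.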

Continue reading our preview of USB 3.1 Performance on ASUS hardware!

Imagination Launches PowerVR GT7900, "Super-GPU" Targeting Consoles

Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM |
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900

As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off of a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU of its Series7XT family, targeting a growing category called "affordable game consoles." Think of Android-powered set-top devices like the Ouya or Amazon's Fire TV.

gt7900-1.png

PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how you can compare its designs to other GPUs on the market. Imagination wants to claim that the GT7900 will offer "PC-class gaming experiences", though that is as ambiguous as the idea of the workload of a "console-level game." But with rated peak performance levels hitting over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half-precision), this GPU does have significant theoretical capability.

                PowerVR GT7900            Tegra X1
Vendor          Imagination Technologies  NVIDIA
FP32 ALUs       512                       256
FP32 GFLOPS     800                       512
FP16 GFLOPS     1600                      1024
GPU Clock       800 MHz                   1000 MHz
Process Tech    16nm FinFET+              20nm TSMC
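The peak-GFLOPS figures in the table fall straight out of ALU count and clock, assuming each FP32 ALU retires one fused multiply-add (two FLOPs) per cycle and FP16 runs at twice the FP32 rate on both parts:

```python
# Peak throughput = ALUs x clock x FLOPs-per-cycle (2 for a fused multiply-add).
def peak_gflops(alus: int, clock_mhz: int, flops_per_cycle: int = 2) -> float:
    return alus * clock_mhz * flops_per_cycle / 1000  # MHz -> GFLOPS

for name, alus, mhz in [("PowerVR GT7900", 512, 800), ("Tegra X1", 256, 1000)]:
    fp32 = peak_gflops(alus, mhz)
    print(f"{name}: {fp32:.0f} GFLOPS FP32, {fp32 * 2:.0f} GFLOPS FP16")
```

The GT7900 works out to about 819 GFLOPS FP32, consistent with the "over 800 GFLOPS" claim (rounded to 800 in the table); the Tegra X1 numbers match exactly.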

Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."

gt7900-2.png

The FP16 performance number listed above is useful as an extreme power savings option where the half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually utilize the FP16 data type but having support for it in the GT7900 allows developers to target it.

Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.

gt7900-3.png

Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.

I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experience to report back with soon. I am continually curious about the market for these types of high-end "mobile" GPUs, given the limited audience that the Android console market currently addresses. Imagination does claim that the GT7900 beats products with performance levels as high as the GeForce GT 730M discrete GPU - no small feat.

Manufacturer: Intel

Intel Pushes Broadwell to the Next Unit of Computing

Intel continues to invest a significant amount of money into this small form factor product dubbed the Next Unit of Computing, or NUC. When it was initially released in December of 2012, the NUC was built as an evolutionary step of the desktop PC, part of a move for Intel to find new and unique form factors that its processors can exist in. With a 4" x 4" motherboard the NUC is certainly a differentiating design, and several of Intel's partners have adopted it for products of their own, Gigabyte's BRIX line being the most relevant.

But Intel's development team continues to push the NUC platform forward and today we are evaluating the most recent iteration. The Intel NUC5i5RYK is based on the latest 14nm Broadwell processor and offers improved CPU performance, a higher speed GPU and lower power consumption. All of this is packed into a smaller package than any previous NUC on the market and the result is both impressive and totally expected.

A Walk Around the NUC

To most people the latest Intel NUC will look very similar to the previous models based on Ivy Bridge and Haswell. And they'd be right, of course: the fundamental design is unchanged. But Intel continues to push forward in small ways, nipping and tucking away. Still, the NUC is just a box. An incredibly small one with a lot of hardware crammed into it, but a box nonetheless.

IMG_1619.jpg

While I can appreciate the details, including the black and silver colors and rounded edges, I think that Intel needs to find a way to add some more excitement to the NUC product line going forward. Admittedly, it is hard to innovate in that direction with a focus on size and compression.

Continue reading our review of the Intel NUC NUC5i5RYK SFF!!

Seemingly Out of Spec AC Cables Could Be a Fire Hazard

Subject: General Tech | February 23, 2015 - 03:31 PM |
Tagged: Vantec, c13

I say “seemingly out of spec” because I am not an electrician, and this requires more understanding of wire classifications than I possess. Regardless, we found a story a little while ago about devices that ship with power cables labeled for voltages and amperages significantly higher than what they appear capable of carrying.

vantec-my-far.jpg

My cable

The minimum requirement for cables with a C13 connector is American 18 gauge (AWG), and they must be able to carry 10 amps. I own the device from the blog post, as do many others at PC Perspective. Again, the device itself (minus the cord that plugs it into the wall) is perfectly fine. The allegation is that the power cord (that goes between the wall and the transformer power brick) cannot carry its full, labeled wattage. The label on the head claims that it can carry 250V at 10A, which is 2500W.

vantec-my-close.jpg

My cable, close up.

We cut open the insides of the cable to see what gauge wire was used, and we were able to remove the insulation with an 18 gauge wire stripper. This is where my lack of applied electrical skills fails me. The power cable feels as flimsy as a quarter-inch audio cable, but I am not qualified to measure the actual internal wires' thickness. It might meet the minimum (18 AWG) requirement, or it might just be thick insulation. I wouldn't trust it, especially not at hundreds or thousands of watts. The blog post author apparently tested their own cable under load, and they claim that it started to melt at 2.6A and 123V (320W).
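The wattages throughout this story are simple P = V x I arithmetic, for anyone checking the math on the label versus the reported failure point:

```python
# Power (watts) is volts times amps.
def watts(volts: float, amps: float) -> float:
    return volts * amps

print(f"Label rating:     {watts(250, 10):.0f} W")   # 250 V at 10 A
print(f"Reported failure: {watts(123, 2.6):.0f} W")  # the blog author's test load
```

The gap between the 2500 W rating on the head and the ~320 W load at which the cable reportedly began to melt is what makes the labeling so alarming.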

VantecPowerCableOfDeath-Conductors.jpg

The blog author's wire vs a standard cable's wire. It's hard to tell how thin the Vantec one is, because the standard cable was twisted.

Image Credit: Fry's Acid Test

Now, to power a single hard drive and USB controller, you are not going to be drawing those hundreds or thousands of watts from the wall. The main concern is if you swap cables around with other devices. For instance, if that cable would be attached to a high-end gaming desktop, then it could easily see wattages in that range that are sustained for most of a play session, or even higher.

So I guess the takeaway from this is: do not trust every power cable that you receive. Make sure your high-power devices are using the cable that came with them, or one from a vendor that you trust. Just because it says it can handle a given load does not mean that it can.

Intel Revamps Atom Branding, Next Generation Atoms Will Come in x3, x5, and x7 Tiers

Subject: General Tech | February 26, 2015 - 02:02 AM |
Tagged: SoFIA, moorefield, Intel, Cherry Trail, branding, atom

Intel is updating its Atom processor branding to better communicate the performance and experience customers can expect from their Intel-powered mobile device. In fact, the new branding specifies three tiers: Atom processors will soon come in Atom x3, x5, and x7 flavors. This branding scheme is similar to the Core processor branding using the i3, i5, and i7 labels.

The Atom x3, x5, and x7 chips are low power, efficient processors for battery powered devices and sit below the Core M series which in turn are below the Core i3, i5, and i7 processors. The following infographic shows off the new branding though Intel does not reveal any specific details about these new Atom chips (we will hopefully know more after Mobile World Congress). Of course, Atom x3 chips will reside in smartphones with x5 and x7 chips powering tablets and budget convertibles. The x7 brand represents the flagship processors of the Atom line.

The new branding will begin with the next generation of Atom chips, which should include Cherry Trail, the 14nm successor to Bay Trail featuring four x86 Airmont cores and Gen 8 Intel graphics. Cherry Trail (the Cherryview SoC) will be used in all manner of mobile devices from entry level 8"+ tablets to larger notebooks and convertibles. It appears that Intel will use Moorefield (a quad core refresh of Merrifield) through 2015 for smartphones, though road maps seem to indicate that Intel's budget SoFIA SoC will also launch this year. SoFIA and Moorefield processors should fall under the Atom x3 brand, while the higher powered and higher clocked Cherry Trail chips will use the Atom x5 and x7 monikers.

What are your thoughts on Intel's new Atom x3/x5/x7 brands?

Source: Intel

Roll over Superfish, PrivDog is just as bad and comes from Comodo

Subject: General Tech | February 25, 2015 - 12:36 PM |
Tagged: fud, Comodo, SSL, security, PrivDog, idiots

This has been a bad week for the secure socket layer, and the news just keeps getting worse.  Comodo provides around one out of every three SSL certs currently in use, as until now they had a sterling reputation and were a trusted provider.  It turns out that this reputation may not be deserved, seeing as how their Internet Security 2014 product ships with an application called Adtrustmedia PrivDog, which is enabled by default.  Not only does this app install a custom root CA certificate which intercepts connections to websites in order to insert customized ads, as SuperFish does; it can also turn invalid HTTPS certificates into valid ones.  That means an attacker can use PrivDog to spoof your bank's SSL cert, redirect you to a fake page, and grab your credentials, while all the time your browser reports a valid and secure connection to the site. 

The only good news from The Register's article is that this specific vulnerability is present only in PrivDog versions 3.0.96.0 and 3.0.97.0, and so has limited distribution.  What is incredibly depressing is what this indicates: the entire SSL certificate model is broken, and even those who create the certs to assure your security feel that inserting a man-in-the-middle attack into their software does not contravene their entire reason for existing.

picarddoublefacepalm.jpg

"The US Department of Homeland Security's cyber-cops have slapped down PrivDog, an SSL tampering tool backed by, er, SSL certificate flogger Comodo.

Comodo, a global SSL authority, boasts a third of the HTTPS cert market, and is already in hot water for shipping PrivDog."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

EVGA would like to give you a GTX 960 SSC and a Z97 FTW motherboard

Subject: Editorial, General Tech, Graphics Cards, Motherboards | February 20, 2015 - 05:10 PM |
Tagged: z97, gtx 960, giveaway, evga, contest

I know, the nerve of some people. Jacob from EVGA emails me this week, complaining about how he has this graphics card and motherboard just sitting in his cubicle taking up space and "why won't I just give it away already!?"

Fine. I'll do it. For science.

So let's make this simple shall we? EVGA wants to get rid of some kick-ass gaming hardware and you want to win it. Why muddle up a good thing?

The Prizes

  • EVGA GeForce GTX 960 SSC
     
    • The EVGA GeForce GTX 960 delivers incredible performance, power efficiency, and gaming technologies that only NVIDIA Maxwell technology can offer. This is the perfect upgrade, offering 60% faster performance and twice the power efficiency of previous-generation cards*. Plus, it features VXGI for realistic lighting, support for smooth, tear-free NVIDIA G-SYNC technology, and Dynamic Super Resolution for 4K-quality gaming on 1080P displays.
       
    • The new EVGA ACX 2.0+ cooler brings new features to the award winning EVGA ACX 2.0 cooling technology. A Memory MOSFET Cooling Plate (MMCP) reduces MOSFET temperatures up to 11°C, and optimized Straight Heat Pipes (SHP) reduce GPU temperature by an additional 5°C. ACX 2.0+ coolers also feature optimized Swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU.

evgacontest1.jpg

  • EVGA Z97 FTW Motherboard
     
    • Welcome to a new class of high performance motherboards with the EVGA Z97 lineup. These platforms offer a return to greatness with a new GUI BIOS interface, reimagined power VRM that focuses on efficiency, and are loaded with features such as Intel® Gigabit LAN, Native SATA 6G/USB 3.0 and more.
       
    • Engineered for the performance users with excellent overclocking features. Includes a GUI BIOS that is focused on functionality, new software interface for overclocking in the O.S., high quality components, M.2 storage option and more.

evgacontest2.jpg

The Process (aka how do you win?)

So even though I'm doing all the work getting this hardware out of Jacob's busy hands and to our readers...you do have to do a couple of things to win the hardware as well. 

  1. Fill out the questionnaire below.
     
  2. Enter the "secret phrase" from tonight's 337th episode of the PC Perspective Podcast. We'll be live streaming at 10pm ET / 7pm PT, or you can wait for the downloadable version at http://www.pcper.com/podcast or the video version on our PC Perspective YouTube channel.

The contest will run for one week, so you will have more than enough time to listen to or watch the podcast and get the super-secret answer. We'll ship anywhere in the world, and one person will win both fantastic prizes! Once the contest closes (Wednesday, February 25th at 12pm ET), we'll randomly draw a winner from the entries below with the correct answer!

A HUGE thanks goes to our friends at EVGA for supplying the hardware for our giveaway. Good luck!

Source: EVGA

Dell's Venue 8 7000 continues to impress

Subject: Mobile | February 25, 2015 - 04:46 PM |
Tagged: z3580, venue 8 7000, venue, tablet, silvermont, moorefield, Intel, dell, atom z3580, Android

Dell's Venue 8 7000 tablet sports an 8.4" 2560x1600 OLED display and is powered by the Moorefield-based Atom Z3580 SoC, with 2GB of LPDDR3-1600 memory, 16GB of internal storage, and support for up to a 512GB microSD card. Even more impressive, The Tech Report had no issues installing apps or moving files to the SD card with ES File Explorer, unlike many Android devices that require certain programs to reside on internal storage. Like Ryan, they had a lot of fun with the RealSense camera and are looking forward to the Lollipop upgrade. Check out The Tech Report's opinion of this impressive Android tablet right here.

screen.jpg

"Dell's Venue 8 7000 is the thinnest tablet around, and that's not even the most exciting thing about it. This premium Android slate packs a Moorefield-based Atom processor with quad x86 cores, a RealSense camera that embeds 3D depth data into still images, and a staggeringly beautiful OLED display that steals the show. Read on for our take on a truly compelling tablet."


Intel Sheds Its Remaining Stake In Imagination Technologies

Subject: General Tech | February 25, 2015 - 08:56 PM |
Tagged: PowerVR, Intel, Imagination Technologies, igp, finance

Update: Currency exchange rates have been corrected. I'm sorry for any confusion!

Intel is selling off its remaining stake in UK-based Imagination Technologies (IMG.LN). According to JP Morgan, Intel is selling 13.4 million shares (4.9% of Imagination Technologies) at 245 GBp each. Once all shares are sold, Intel will gross just north of $50.57 million USD.
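As a back-of-the-envelope sanity check, the reported figure works out from the share count and price. Note the GBP/USD exchange rate below is our assumption (roughly the early-2015 level), not a number from the report:

```python
# Rough check of the reported gross proceeds from the share sale.
GBP_USD = 1.54             # assumed exchange rate (~early-2015); not stated in the report

shares = 13_400_000        # shares Intel is selling (4.9% of Imagination Technologies)
price_gbp = 2.45           # 245 GBp = 2.45 GBP per share

gross_gbp = shares * price_gbp    # ~32.8 million GBP
gross_usd = gross_gbp * GBP_USD   # ~50.6 million USD

print(f"£{gross_gbp / 1e6:.2f}M ≈ ${gross_usd / 1e6:.2f}M")
```

The result lands within pennies-per-million of the quoted $50.57M, with the small gap down to the exact exchange rate used on the day.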

PowerVR Rogue Series6XT GPU.png

Imagination Technologies' PowerVR Rogue Series 6XT GPU is used in Apple's A8-series chips.

Intel first invested in Imagination Technologies back in October of 2006 in a deal to gain access to the company’s PowerVR graphics IP portfolio. Since then, Intel has been slowly moving away from PowerVR graphics in favor of its own internal HD Graphics GPUs. (Further, Intel sold off 10% of its IMG.LN stake in June of last year.) Even Intel’s low-cost Atom line of SoCs has mostly moved to Intel GPUs, with the exception of the mobile Merrifield and Moorefield smartphone/tablet SoCs.

The expansion of Intel’s own graphics IP, combined with Imagination Technologies' acquisition of MIPS, is reportedly the “inevitable” reason for the sale. According to The Guardian, industry analysts have speculated that, as it stands, Intel is a minor customer of Imagination Technologies, accounting for less than 5% of its graphics business (and a licensing agreement signed this year means the sale doesn’t rule out PowerVR graphics permanently). Imagination Technologies still has a decent presence in the mobile (ARM-based) space, with customers including Apple, MediaTek, Rockchip, Freescale, and Texas Instruments.

Currently, the company’s stock price is sitting at 258.75 GBp (~$3.99 USD), which seems to indicate that the Intel sell-off was already priced in, or simply does not have investors that concerned.

What do you think about the sale? Where does this leave Intel as far as graphics goes? Will we see Intel HD Graphics scale down to smartphones or will the company go with a PowerVR competitor? Would Intel really work with ARM’s Mali, Qualcomm’s Adreno, or Samsung’s rumored custom GPU cores? On that note, an Intel powered smartphone with NVIDIA Tegra graphics would be amazing (hint, hint Intel!)

Don't forget the 1TB Crucial BX100 costs less than $400

Subject: Storage | February 23, 2015 - 05:25 PM |
Tagged: ssd, SM2246EN, sata, micron, crucial, BX100, 1TB

It has been about a week since Al posted his review of the 256GB and 512GB models of the Crucial BX100, and what better way to remind you than with a review of the 1TB model, currently a mere $380 on Amazon (or only $374 on BHPhoto.com!). Hardware Canucks cracked open the 1TB budget-priced consumer SSD for your enjoyment right here, as well as running it through a gamut of tests. As expected, their results are in line with the 512GB model, as both use a four-channel controller, which does mean they are slower than some competitors' drives. On the other hand, the BX100's significantly lower price makes the 1TB model much more accessible. Check out their post here.

board2_sm.jpg

"Crucial's BX100 combines performance, endurance and value into one awesome budget-friendly SSD. The best part? The 1TB version costs just $400."


Intel Plans 7nm in 2018

Subject: Processors | February 26, 2015 - 10:27 PM |
Tagged: Intel, 14nm, 10nm, 7nm

In the PC industry, our CPUs are beginning to appear at 14nm while graphics processors have been at 28nm for a few years. Smaller features allow more complicated circuits in the same area, which means less power, less heat, and more products from a single wafer (assuming you can keep defects to a minimum). Intel expects to release 10nm in late 2016 (possibly slipping into early 2017) and has just announced plans for 7nm in 2018.
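The density benefit of a node shrink can be sketched with ideal scaling, where the area per transistor shrinks with the square of the feature size. This is a textbook upper bound, not anything Intel has quoted, and real processes fall short of it:

```python
# Ideal (best-case) density scaling between process nodes:
# if every dimension shrinks linearly with the feature size,
# transistor density improves by (old_node / new_node) squared.
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

for old, new in [(14, 10), (10, 7), (14, 7)]:
    print(f"{old}nm -> {new}nm: ~{ideal_density_gain(old, new):.1f}x density")
```

By this idealized measure, each of the two planned shrinks roughly doubles density, and going from today's 14nm to 7nm would be about a 4x improvement.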

Intel-logo.png

According to Ars Technica, this 7nm process is expected to move beyond silicon FinFETs. At room temperature, a 7nm-wide silicon structure is a lattice only about 14 atoms across. Intel was quiet with the details, but Ars expects that “III-V transistors” will be the next stage: semiconductors made from alloys of Group III and Group V elements. One example of a III-V semiconductor is indium gallium arsenide; indium and gallium are in Group III, while arsenic is in Group V. Apart from using a new material for transistors, it is speculated that Intel might change the way it packages chips into a 2.5D or 3D configuration (perhaps even depending on the use case).

Source: Ars Technica

Epic Games Announces "Unreal Dev Grants"

Subject: General Tech | February 21, 2015 - 07:00 AM |
Tagged: unreal engine 4, unreal engine, epic games

On Thursday, Tim Sweeney joined the Unreal Engine 4 Twitch broadcast to announce “Unreal Dev Grants”. In short, Epic Games has set aside $5 million to hand out in grants of five thousand dollars ($5,000 USD) to fifty thousand dollars ($50,000 USD), with no strings attached. If you are doing something cool in, with, or involving Unreal Engine 4, you are eligible, and you can use the money in any way. You keep all your “intellectual property” and equity, and you do not even have any accountability requirements.

epic-ue4-dev-grants.jpg

It's free money that you can apply for, or that Epic will even approach you with if they see you doing something awesome (you can even nominate other people's projects). The only “catch” is that your work needs to be relevant to Unreal Engine. From there, it could be anything from congratulating an awesome pull request for the engine on GitHub to giving an indie (or even AAA) game a bit of a financial boost. Tim Sweeney told stories about mowing lawns for the $3,000 it took him to launch ZZT. He mowed lawns so you don't have to.

For more information, or to nominate yourself or someone else, check out their website.

Source: Epic Games