
Contest: Win a 400GB Intel 750 Series SSD from Intel and PC Perspective!

Subject: Editorial | May 29, 2015 - 12:37 PM |
Tagged: SSD 750, PCI Express, NVMe, Intel, giveaway, contest, 750 series

PC Perspective and Intel are partnering to offer up a giveaway with some pretty impressive swag. Surely by now you have read all about the new Intel SSD 750 Series of products, a new class of solid state drive that combines four lanes of PCI Express 3.0 and a new protocol called NVM Express (NVMe) for impressive bandwidth throughput. In Allyn's review of the SSD in April, he called it "the obvious choice for consumers who demand the most from their storage" and gave it a PC Perspective Editor's Choice Award!

contest1.jpg

Thanks to our friends at Intel we are going to be handing out a pair of the 400GB add-in card models to loyal PC Perspective readers and viewers. How can you enter? The rules are dead simple:

  1. Fill out the contest entry form below to find multiple entry methods including reading our review, answering a question about Intel SSD 750 Series specs or following us on Twitter. You can fill out one or all of the methods - the more you do the better your chances!
     
  2. Leave a comment on the news post below thanking Intel for sponsoring PC Perspective and for supplying this hardware for us to give to you!
     
  3. This is a global contest - so feel free to enter from anywhere in the world!
     
  4. Contest will close on June 2nd, 2015.

Win an Intel SSD 750 Series From PC Perspective and Intel!

Our most sincere thanks to Intel for bringing this contest to PC Perspective's readers and fans. Good luck to everyone (except Josh)!

Sponsored by Intel

contest2.jpg

Product Specifications

Capacity | Sequential 128KB Read (up to MB/s) | Sequential 128KB Write (up to MB/s) | Random 4KB Read (up to IOPS) | Random 4KB Write (up to IOPS) | Form Factor | Interface
400 GB | 2,200 | 900 | 430,000 | 230,000 | 2.5-inch x 15mm | PCI Express Gen3 x4
1.2 TB | 2,400 | 1,200 | 440,000 | 290,000 | 2.5-inch x 15mm | PCI Express Gen3 x4
400 GB | 2,200 | 900 | 430,000 | 230,000 | Half-height half-length (HHHL) Add-in Card | PCI Express Gen3 x4
1.2 TB | 2,400 | 1,200 | 440,000 | 290,000 | Half-height half-length (HHHL) Add-in Card | PCI Express Gen3 x4
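For a sense of how much headroom those sequential numbers leave on the interface, here is a quick back-of-the-envelope sketch of PCI Express 3.0 x4 bandwidth; the per-lane math (8 GT/s with 128b/130b encoding) is the usual rule of thumb rather than a figure from Intel's datasheet.

```c
#include <stdio.h>

int main(void) {
    /* PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~985 MB/s payload per lane */
    const double lane_mb_s = 8000.0 * 128.0 / 130.0 / 8.0;  /* ~984.6 MB/s */
    const double link_mb_s = lane_mb_s * 4.0;                /* x4 link -> ~3,938 MB/s */

    printf("PCIe 3.0 x4 payload bandwidth: ~%.0f MB/s\n", link_mb_s);
    printf("1.2 TB model's 2,400 MB/s sequential read uses ~%.0f%% of the link\n",
           100.0 * 2400.0 / link_mb_s);
    return 0;
}
```

Even the fastest drive in the table leaves roughly 40% of the link unused, the kind of headroom SATA's ~550 MB/s practical ceiling could never offer.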

Experience the future of storage performance for desktop client and workstation users with the Intel® SSD 750 Series. The Intel SSD 750 Series delivers uncompromised performance by utilizing NVM Express* over four lanes of PCIe* 3.0.

With both Add-in Card and 2.5-inch form factors, the Intel SSD 750 Series eases migration from SATA to PCIe 3.0 without power or thermal limitations on performance. The SSD can now deliver the ultimate in performance in a variety of system form factors and configurations.

Source: Intel.com

Author:
Manufacturer: NVIDIA

Specifications

When NVIDIA launched the GeForce GTX Titan X only back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card, and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.

Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.

P1020283.jpg

The devil is in all the other details, of course. AMD has its own plans for this summer but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by more than half. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X put up enough fight with the aging Hawaii XT GPU to make its value case to gamers on the fence?

Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?

Continue reading our review of the new NVIDIA GeForce GTX 980 Ti 6GB Graphics Card!!

Rumor: Intel Core i7-6700K (Skylake-S) Benchmarks Leaked

Subject: Processors | May 28, 2015 - 03:44 PM |
Tagged: Intel, Skylake, skylake-s, haswell, devil's canyon

For a while, it was unclear whether we would see Broadwell on the desktop. With the recently leaked benchmarks of the Intel Core i7-6700K, it seems all-but-certain that Intel will skip it and go straight to Skylake. Compared to Devil's Canyon, the Haswell-based Core i7-4790K, the Skylake-S Core i7-6700K has the same base clock (4.0 GHz) and same full-processor Turbo clock (4.2 GHz). Pretty much every improvement that you see is pure performance per clock (IPC).

intel-2015-cpumonkey-skylakes-benchmark.png

Image Credit: CPU Monkey

In multi-threaded applications, the Core i7-6700K tends to get about a 9% increase while, when a single core is being loaded, it tends to get about a 4% increase. Part of this might be the slightly lower single-core Turbo clock, which is said to be 4.2 GHz instead of 4.4 GHz. There might also be some increased efficiency with HyperThreading or cache access -- I don't know -- but it would be interesting to see.
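As a rough sanity check on how much of that single-threaded gain is IPC rather than clock speed, you can normalize the score ratio by the clock ratio. The sketch below just uses the approximate figures quoted above, not exact benchmark data.

```c
#include <stdio.h>

int main(void) {
    /* Approximate figures from the leak: ~4% higher single-threaded score,
       despite a 4.2 GHz turbo versus Devil's Canyon's 4.4 GHz. */
    const double score_ratio = 1.04;       /* 6700K / 4790K, single-threaded */
    const double clock_ratio = 4.2 / 4.4;  /* 6700K / 4790K turbo clocks     */

    /* Performance = IPC x clock, so the implied IPC gain is score / clock */
    printf("Implied IPC improvement: ~%.0f%%\n",
           (score_ratio / clock_ratio - 1.0) * 100.0);  /* ~9% */
    return 0;
}
```

That lines up nicely with the ~9% multi-threaded gain, where both chips run the same 4.2 GHz all-core turbo.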

I should note that we know nothing about the GPU. In fact, CPU Monkey fails to list a GPU at all. Intel has expressed interest in bringing Iris Pro-class graphics to high-end mainstream desktop processors. For someone who is interested in GPU compute, especially with Explicit Unlinked MultiAdapter upcoming in DirectX 12, it would be nice to see GPUs be ubiquitous and always enabled. The chip is expected to have the new GT4e graphics with 72 execution units and either 64 or 128MB of eDRAM. If clocks are equivalent, this could translate to well over a teraflop (~1.2 TFLOPS) of compute performance in addition to discrete graphics. In discrete graphics terms, that would be nearly equivalent to an NVIDIA GTX 560 Ti.
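For the curious, here is roughly where a ~1.2 TFLOPS figure comes from. The 16 single-precision FLOPS per EU per clock (two SIMD-4 FMA units) and the ~1.05 GHz graphics clock are my assumptions for the sake of the arithmetic, not confirmed Skylake GT4e specifications.

```c
#include <stdio.h>

int main(void) {
    const double eus       = 72.0;  /* rumored GT4e execution unit count     */
    const double flops_clk = 16.0;  /* assumed FP32 FLOPS per EU per clock   */
    const double clock_ghz = 1.05;  /* assumed graphics clock, Iris Pro-ish  */

    printf("Estimated GT4e compute: ~%.2f TFLOPS\n",
           eus * flops_clk * clock_ghz / 1000.0);  /* ~1.21 TFLOPS */
    return 0;
}
```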

We are expecting to see the Core i7-6700K launch in Q3 of this year. We'll see.

Source: CPU Monkey

PCPer Live! LogitechG Showcase, Q&A and Giveaway!

Subject: General Tech | May 26, 2015 - 02:40 PM |
Tagged: video, pcper live, logitechg, logitech, live, giveaway, contest

UPDATE: Did you miss our live stream yesterday? Sorry to hear that! But the good news is that you can still watch the showcase and hear the questions (and answers) on the LogitechG product line. Enjoy!

If you haven't looked up what LogitechG has done lately, you are missing out on some of the most interesting and innovative gaming accessories in the market. My reconnection with the Logitech brand started in July of last year when they sent over the G402 Hyperion Fury gaming mouse, a device with a pair of sensors capable of tracking speeds as high as 28.4 mph - seriously!

Since then the team has been releasing new products and updated software that show a dedication and drive to the enthusiast and gaming markets. Take the company's Intelligent Illumination system, which allows game developers to change the colors and brightness of the G910 Orion Spark based on what is happening in the game. Is your health critical? Then maybe you want the keyboard lights to pulse bright red. Or maybe when you jump into a flying vehicle you would like to illuminate the appropriate control keys? All of this is possible.
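To make the idea concrete, here is a tiny sketch of what that game-to-keyboard hookup looks like in concept. The function names are placeholders invented for illustration; this is not the actual Logitech SDK.

```c
#include <stdbool.h>
#include <stdio.h>

/* Stand-in for an SDK call that sets the keyboard backlight as RGB
   percentages - a made-up name, not a real Logitech function. */
static void set_keyboard_color(int red_pct, int green_pct, int blue_pct) {
    printf("backlight -> R:%d%% G:%d%% B:%d%%\n", red_pct, green_pct, blue_pct);
}

/* A game could call something like this every frame with its current state. */
static void update_keyboard_lighting(double health_fraction, bool in_vehicle) {
    if (health_fraction < 0.25)
        set_keyboard_color(100, 0, 0);      /* critical health: pulse red     */
    else if (in_vehicle)
        set_keyboard_color(0, 60, 100);     /* highlight vehicle control keys */
    else
        set_keyboard_color(100, 100, 100);  /* default backlight              */
}

int main(void) {
    update_keyboard_lighting(0.15, false);  /* low health   -> red  */
    update_keyboard_lighting(0.90, true);   /* in a vehicle -> blue */
    return 0;
}
```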

G910_DETAIL1.jpg

Besides the cool technology they have to show off, the truth is that Chris, the LogitechG peripheral guru that will be joining me, knows almost everything about keyboards and mice and headsets (oh my!). If you have a question about a specific LogitechG product, or about mice and keyboard technologies in general, this is the guy to ask and the outlet to do it.

pcperlive.png

LogitechG Showcase, Q&A and Giveaway!!

12pm PT / 3pm ET - May 27th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

And of course, what would a live stream like this be without some giveaways? Logitech is going to supply hardware to give out during the live stream so you are going to want to be in attendance for sure! But to help build up some excitement, we are going to be giving away combinations of the G402 Hyperion Fury mouse and G440 mousing pad every weekday between now and the live stream! To sign up for the daily giveaways, use the form below. And for a chance to win a whole lot more gaming hardware, make sure you tune in on May 27th at 3pm ET / 12pm PT!!

402+440.jpg

LogitechG Live Stream Promotional Giveaway!

If you have questions for Chris from Logitech, about mice, keyboards, switches, RGB illumination, tracking technologies - anything - let us know in the comments!

Author:
Manufacturer: NVIDIA

SHIELD Specifications

Announced just this past June at last year’s Google I/O event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.

NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.

DSC01740.jpg

Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.

And though this review will focus on the NVIDIA SHIELD, it’s impossible not to marry the success of SHIELD with the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve over other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.

But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.

Continue reading our review of the new NVIDIA SHIELD with Android TV!!

Rumor: Entire Upcoming AMD Radeon 300-Series to Be Rebrands of Existing 200-Series?

Subject: Graphics Cards | May 26, 2015 - 05:29 PM |
Tagged: rumors, leaks, Hawaii XT, Fiji, amd radeon

This sounds like disappointing news, but it might be a little misleading as well. First of all, the report, another from the folks at VideoCardz, details information about these upcoming GPUs gleaned from a leaked driver.

AMD_Radeon_480.png

Here’s the list VideoCardz came up with (200-series name on the right):

  • AMD6658.1 AMD Radeon(TM) R7 360 Graphics Bonaire XTX [Radeon R7 260X]
  • AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT [Radeon R9 290X]
  • AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO [Radeon R9 290]
  • AMD6810.1 AMD Radeon(TM) R7 370 Graphics Curacao XT [Radeon R9 270X]
  • AMD6810.2 AMD Radeon (TM) R7 300 Series Curacao XT [Radeon R9 270X]
  • AMD6811.1 AMD Radeon (TM) R7 300 Series Curacao PRO [Radeon R9 270/370]
  • AMD6939.1 AMD Radeon R9 300 Series Tonga PRO [Radeon R9 285]

VideoCardz further comments on this list:

“FIJI was not included in this driver release. Radeon R9 Hawaii is neither shown as R9 380 nor as R9 390. Right now it’s just R9 300, just like R9 285-rebrand. Both Hawaii cards will get a small bump in clock speeds and most importantly 8GB memory.”

And this is where the news might be misleading, as the “300-series” naming may very well not include a new GPU due to AMD shifting to a new scheme for their upcoming flagship, just as NVIDIA has used the “TITAN” and “TITAN X” names for flagship cards. If the rumored AMD Fiji card (which was recently pictured) is the all-new GPU featuring HBM memory that we’ve all been waiting for, expect iterations of this new card to follow as the technology trickles down to the more affordable segment.

Computex 2015: ASUS 3800R 34″ Ultrawide 3440x1440 IPS 21:9 Curved G-SYNC Monitor

Subject: Displays | June 1, 2015 - 12:21 PM |
Tagged: nvidia, gsync, g-sync, asus, 3800R

At Computex this week ASUS is showing off a prototype of the new ROG 3800R monitor, a 34-in curved display with a 3440x1440 resolution and G-Sync variable refresh rate capability. ASUS claims on its PCDIY blog that the 21:9 aspect ratio was one of the "most requested" specifications for a new ROG monitor, followed by a curved design. The result is a gorgeous display:

ROG-34-inch-curved-gaming-monitor.jpg

Here's a list of specifications:

  • 34” optimal dimension for QHD resolutions with 3440×1440 resolution
  • 21:9 ultra-wide aspect ratio for increased immersion and improved horizontal workflow
  • IPS based panel for superior color reproduction, black levels and reduction of color shifting
  • NVIDIA G-SYNC equipped offering smooth, fluid and tear free gaming with improved motion clarity. Additionally equipped with ULMB operating mode for outstanding motion clarity.
  • Frameless design for seamless surround gaming
  • ASUS exclusive GamePlus feature and Turbo Key
  • Ergonomic adjustment including tilt, swivel and height adjustment

Hot damn, we want one of these and we want it yesterday! There is no mention of the refresh rate of the display here, though we did see information from NVIDIA that ASUS was planning a 34x14 60 Hz screen - but we are not sure this is the same model being shown. And the inclusion of ULMB would normally indicate a refresh rate above 60-75 Hz...

Another interesting note: this monitor appears to include both DisplayPort and HDMI connectivity.

This 34-inch 3800R curved display features wide viewing angles, a 3440 x 1440 native resolution, and a 21:9 aspect ratio. It features NVIDIA® G-SYNC™ display technology to deliver smooth, lag-free visuals. G-SYNC synchronizes the display’s refresh rate to the GPU in any GeForce® GTX™-powered PC to eliminate screen tearing and minimize display stutter and input lag. This results in sharper, more vibrant images and more fluid, responsive gameplay. It has extensive connectivity options that include DisplayPort and HDMI.

The above information came from ASUS just a few short hours ago, so you can assume that it is accurate. Could this be the start of panels that integrate dual scalers (a G-Sync module plus something else) to offer more connectivity, or has the G-Sync module been updated to support more inputs? We'll find out!

LG 27MU67: 27-inch, 4K, IPS, FreeSync Monitor

Subject: Displays | May 29, 2015 - 04:59 PM |
Tagged: LG, ips, freesync, 4k

LG Australia published a product page for the LG 27MU67 monitor, a display that the rest of the company doesn't yet seem to acknowledge exists. It is still online, even after three days' worth of time in which someone could have pulled the plug. This one is interesting for a variety of reasons: it's 4K, it's IPS, and it supports AMD FreeSync. It is also relatively cheap for that combination, being listed at $799 AUD RRP.

lg-2015-freesync4kips-27mu67.jpg

Some websites have converted that to ~$610 to $620 USD, but it might even be less than that. Australian prices are often listed with their federal tax rolled in, which would yield a price that is inflated about 10%. It is possible, though maybe wishful thinking, that this monitor could retail in the ~$500 to $550 price range for the United States (if it even comes to North America). Again, this is a 4K, IPS, FreeSync panel.
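Here is the arithmetic behind that hunch, as a quick sketch; 10% is Australia's standard GST rate, and the exchange rate is an assumption of roughly where AUD/USD sat in mid-2015.

```c
#include <stdio.h>

int main(void) {
    const double aud_rrp     = 799.0;  /* listed Australian RRP, GST included */
    const double gst         = 0.10;   /* Australian Goods and Services Tax   */
    const double usd_per_aud = 0.77;   /* assumed mid-2015 exchange rate      */

    double aud_ex_tax = aud_rrp / (1.0 + gst);
    printf("Ex-tax: ~$%.0f AUD, or roughly $%.0f USD\n",
           aud_ex_tax, aud_ex_tax * usd_per_aud);  /* ~$726 AUD, ~$560 USD */
    return 0;
}
```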

Very little is posted on LG's website, and thus it is hard to tell how good of an IPS panel this is. It is listed as 99% sRGB coverage, which is good for typical video but not the best if you are working on printed content, such as magazine illustrations. On the other hand, this is a gaming panel, not a professional one. Update (May 29, 2015): It also has 10-bit (per channel) color. It sounds like it is true 10-bit, not just a look-up table, but I should note that it doesn't explicitly say that.

Again, pricing and availability are up in the air, because this is not an official announcement. It is listed to launch in Australia at $799 AUD, though.

Source: LG Australia

NVIDIA G-Sync Update: New Monitors, Windowed Mode, V-Sync Options

Subject: Displays | May 31, 2015 - 06:00 PM |
Tagged: nvidia, gsync, g-sync, computex 2015, computex

In conjunction with the release of the new GeForce GTX 980 Ti graphics card today, NVIDIA is making a handful of other announcements around the GeForce brand. The most dramatic of the announcements center around the company's variable refresh monitor technology called G-Sync. I assume that any of you reading this are already intimately familiar with what G-Sync is, but if not, check out this story that dives into how it compares with AMD's rival tech called FreeSync.

First, NVIDIA is announcing a set of seven new G-Sync ready monitors that will be available this summer and fall from ASUS and Acer.

980ti-42.jpg

Many of these displays offer configurations of panels we haven't yet seen in a G-Sync display. Take the Acer X34 for example: this 34-in monitor falls into the 21:9 aspect ratio form factor, with a curved screen and a 3440x1440 resolution. The refresh rate will peak at 75 Hz while also offering the color consistency and viewing angles of an IPS screen. This is the first 21:9, the first 34x14 and the first curved monitor to support G-Sync, and with a 75 Hz maximum refresh it should provide a solid gaming experience. ASUS has a similar model, the PG34Q, though it peaks at a refresh rate of 60 Hz.

ASUS will be updating the wildly popular ROG Swift PG278Q display with the PG279Q, another 27-in monitor with a 2560x1440 resolution. Only this time it will run at 144 Hz with an IPS screen rather than TN, again resulting in improved color clarity, viewing angles and lower eye strain. 

Those of you on the lookout for 4K panels with G-Sync support will be happy to find IPS iterations of that configuration, though they will still peak at a 60 Hz refresh rate - as much a limitation of DisplayPort as anything else.
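A rough bandwidth estimate shows why DisplayPort is the sticking point at 4K. The sketch below counts only active pixel data and ignores blanking intervals, which would only make the squeeze tighter.

```c
#include <stdio.h>

int main(void) {
    /* DisplayPort 1.2: 4 lanes x 5.4 Gbps (HBR2), 8b/10b encoding */
    const double dp12_gbps = 4.0 * 5.4 * 8.0 / 10.0;         /* 17.28 Gbps usable */

    const double pixels = 3840.0 * 2160.0;                    /* 4K at 24 bpp below */
    const double gbps_60hz  = pixels * 24.0 * 60.0  / 1e9;    /* ~11.9 Gbps */
    const double gbps_120hz = pixels * 24.0 * 120.0 / 1e9;    /* ~23.9 Gbps */

    printf("DP 1.2 usable: %.2f Gbps | 4K60: %.1f Gbps | 4K120: %.1f Gbps\n",
           dp12_gbps, gbps_60hz, gbps_120hz);
    return 0;
}
```

4K at 60 Hz fits comfortably inside DP 1.2's usable bandwidth, but doubling the refresh rate blows right past it.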

Another technology addition for G-Sync with the 352-series (353-series, sorry!) driver released today is support for windowed mode variable refresh.

gsync-windows.jpg

By working some magic with the DWM (Desktop Window Manager), NVIDIA was able to allow VRR to operate without requiring a game to be in full screen mode. For gamers that like to play windowed or borderless windowed while using secondary or large displays for other side activities, this is going to be a great addition to the G-Sync portfolio.

Finally, after much harassment and public shaming, NVIDIA is going to allow users to choose whether V-Sync is enabled or disabled when the game's render rate exceeds the maximum refresh rate of the G-Sync monitor it is attached to.

gsyncsettings2.png

One of the complaints about G-Sync has been that it is restrictive on the high side of the VRR window for its monitors. While FreeSync allowed you to selectively enable or disable V-Sync when your frame rate goes above the maximum refresh rate, G-Sync was forcing users into a V-Sync enabled state. The reasoning from NVIDIA was that allowing for horizontal tearing of any kind with G-Sync enabled would ruin the experience and/or damage the technology's reputation. But now, while the default will still be to keep V-Sync on, gamers will be able to manually set the V-Sync mode to off with a G-Sync monitor.

Why is this useful? Many gamers believe that a drawback to V-Sync enabled gaming is the added latency of waiting for a monitor to refresh before drawing a frame that might be ready to be shown to the user immediately. G-Sync fixes this for frame rates from 1 FPS up to the maximum refresh of the G-Sync monitor (144 Hz, 75 Hz, 60 Hz). Now, rather than being stuck with tear-free but latency-added V-Sync when gaming over the max refresh, you'll be able to play with tearing on the screen but lower input latency. This could be especially useful for gamers using 60 Hz G-Sync monitors with 4K resolutions.
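To put a number on that added latency, the worst case with V-Sync on is waiting one full refresh interval for the next scanout. A quick sketch:

```c
#include <stdio.h>

int main(void) {
    const double refresh_hz[] = { 60.0, 75.0, 144.0 };

    for (int i = 0; i < 3; i++) {
        /* Worst case with V-Sync on: a finished frame waits one full refresh */
        double max_wait_ms = 1000.0 / refresh_hz[i];
        printf("%5.0f Hz panel: up to %.1f ms added before a ready frame appears\n",
               refresh_hz[i], max_wait_ms);
    }
    return 0;
}
```

At 60 Hz that worst case is nearly 17 ms, which is why the option matters most on 4K G-Sync panels.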

Oh, actually one more thing: you'll now be able to enable ULMB (ultra low motion blur) mode in the driver as well without requiring entry into your display's OSD.

gsyncsettings1.png

NVIDIA is also officially announcing G-Sync for notebooks at Computex. More on that in this story!

NVIDIA G-Sync for Notebooks Announced, No Module Required

Subject: Displays, Mobile | May 31, 2015 - 06:00 PM |
Tagged: nvidia, notebooks, msi, mobile, gsync, g-sync, asus

If you remember back to January of this year, Allyn and I posted an article that confirmed the existence of a mobile variant of G-Sync thanks to a leaked driver and an ASUS G751 notebook. Rumors and speculation floated around the Internet ether for a few days, but we eventually got official word from NVIDIA that G-Sync for notebooks was a real thing and that it would launch "soon." Well, that day has finally arrived with the beginning of Computex.

mobile1.jpg

G-Sync for notebooks has no clever branding, no "G-Sync Mobile" or anything like that, so discussing it will be a bit more difficult since the technologies are different. Going forward, NVIDIA claims that any gaming notebook using NVIDIA GeForce GPUs will be a G-Sync notebook and will support all of the goodness that variable refresh rate gaming provides. This is fantastic news, as notebook gaming often runs at lower frame rates than you would find on a desktop PC because of lower-powered hardware driving displays of comparable (1080p, 1440p) resolution.

mobile2.jpg

Of course, as we discovered in our first look at G-Sync for notebooks back in January, the much debated G-Sync module is not required and will not be present on notebooks featuring the variable refresh technology. So what gives? We went over some of this before, but it deserves to be detailed again.

NVIDIA uses the diagram above to demonstrate the headaches presented by the monitor and GPU communication path before G-Sync was released. You had three different components - the GPU, the monitor scaler and the monitor panel - that all needed to work together if VRR was going to become a high quality addition to the gaming ecosystem.

mobile3.jpg

NVIDIA's answer was to take over all aspects of the pathway for pixels from the GPU to the eyeball, creating the G-Sync module and helping OEMs hand pick the best panels that would work with VRR technology. This helped NVIDIA make sure it could do things to improve the user experience, such as implementing an algorithmic low-frame-rate, frame-doubling capability to maintain smooth and tear-free gaming at frame rates under the panel's physical limitations. It also allowed them to tune the G-Sync module to the specific panel to help with ghosting and to implement variable overdrive logic.
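The frame-doubling idea is simple in concept: when the game's frame rate drops below the panel's minimum refresh, each frame is scanned out two or more times so the panel stays inside its valid range. Here is a minimal sketch of that logic, assuming a hypothetical 30-144 Hz VRR window; NVIDIA's actual algorithm in the module is surely more sophisticated.

```c
#include <stdio.h>

/* Pick the smallest multiplier that keeps the effective refresh rate
   inside the panel's supported VRR window. */
static int frame_multiplier(double fps, double panel_min_hz, double panel_max_hz) {
    int mult = 1;
    while (fps * mult < panel_min_hz && fps * (mult + 1) <= panel_max_hz)
        mult++;
    return mult;
}

int main(void) {
    const double panel_min = 30.0, panel_max = 144.0;  /* assumed VRR window */
    const double fps_samples[] = { 120.0, 45.0, 24.0, 12.0 };

    for (int i = 0; i < 4; i++) {
        int m = frame_multiplier(fps_samples[i], panel_min, panel_max);
        printf("%5.0f FPS -> each frame shown %dx, panel refreshes at %.0f Hz\n",
               fps_samples[i], m, fps_samples[i] * m);
    }
    return 0;
}
```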

980ti-32.jpg

All of this is required because of the incredible amount of variability in the monitor and panel markets today.

mobile4.jpg

But with notebooks, NVIDIA argues, there is no variability at all to deal with. The notebook OEM gets to handpick the panel, and the GPU directly interfaces with the screen instead of passing through a scaler chip. (Note that some desktop monitors, like the ever popular Dell 3007WFP, did this as well.) There is no other piece of logic in the way attempting to enforce a fixed refresh rate. Because of that direct connection, the GPU is able to control the data passing between it and the display without any other logic working in the middle. This makes implementing VRR technology much simpler and helps with quality control, because NVIDIA can validate the panels with the OEMs.

mobile5.jpg

As I mentioned above, going forward, all new notebooks using GTX graphics will be G-Sync notebooks and that should solidify NVIDIA's dominance in the mobile gaming market. NVIDIA will be picking the panels, and tuning the driver for them specifically, to implement anti-ghosting technology (like what exists on the G-Sync module today) and low frame rate doubling. NVIDIA also claims that the world's first 75 Hz notebook panels will ship with GeForce GTX and will be G-Sync enabled this summer - something I am definitely looking forward to trying out myself.

Though it wasn't mentioned, I am hopeful that NVIDIA will continue to allow users the ability to disable V-Sync at frame rates above the maximum refresh of these notebook panels. With most of them limited to 60 Hz (but this applies to 75 Hz as well) the most demanding gamers are going to want that same promise of minimal latency.

mobile6.jpg

At Computex we'll see a handful of models announced with G-Sync up and running. It should be no surprise of course to see the ASUS G751 with the GeForce GTX 980M GPU on this list as it was the model we used in our leaked driver testing back in January. MSI will also launch the GT72 G with a 1080p G-Sync ready display and GTX 980M/970M GPU option. Gigabyte will have a pair of notebooks: the Aorus X7 Pro-SYNC with GTX 970M SLI and a 1080p screen as well as the Aorus X5 with a pair of GTX 965M in SLI and a 3K resolution (2560x1440) screen. 

This move is great for gamers and I am eager to see what the resulting experience is for users that pick up these machines. I have long been known as a proponent of variable refresh displays and getting access to that technology on your notebook is a victory for NVIDIA's team.

Rumor: NVIDIA GeForce GTX 980 Ti Specifications Leaked

Subject: Graphics Cards | May 26, 2015 - 05:03 PM |
Tagged: rumors, nvidia, leaks, GTX 980 Ti, gpu, gm200

Who doesn’t love rumor and speculation about unreleased products? (Other than the manufacturers of such products, of course.) Today VideoCardz is reporting, via HardwareBattle, a GPU-Z screenshot reportedly showing specs for an NVIDIA GeForce GTX 980 Ti.

gpuz_screenshot.png

Image credit: HardwareBattle via VideoCardz.com

First off, the HardwareBattle logo conveniently obscures the hardware ID (as well as ROP/TMU counts). What is visible is the 2816 shader count, which places it between the GTX 980 (2048) and TITAN X (3072). The 6 GB of GDDR5 memory has a 384-bit interface and 7 Gbps speed, so bandwidth should be the same 336 GB/s as the TITAN X. As far as core clocks on this GPU (which seems likely to be a cut-down GM200), they are identical to those of the TITAN X as well with 1000 MHz Base and 1076 MHz Boost clocks shown in the screenshot.
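For anyone who wants to check that bandwidth figure, it falls straight out of the bus width and effective memory speed:

```c
#include <stdio.h>

int main(void) {
    const double bus_width_bits = 384.0;  /* memory interface width            */
    const double data_rate_gbps = 7.0;    /* effective GDDR5 data rate per pin */

    /* GB/s = bits moved per second across the whole bus, divided by 8 */
    printf("Memory bandwidth: %.0f GB/s\n",
           bus_width_bits * data_rate_gbps / 8.0);  /* 336 GB/s */
    return 0;
}
```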

videocardz_image.PNG

Image credit: VideoCardz.com

We await any official announcement, but from the frequency of the leaks it seems we won’t have to wait too long.

Author:
Manufacturer: Sapphire

Big Things, Small Packages

Sapphire isn’t a brand we have covered in a while, so it is nice to see a new and interesting product land on our doorstep.  Sapphire was a relative unknown until around the Radeon 9700 Pro days.  This was around the time when ATI decided that they did not want to be so vertically integrated, so they allowed other companies to start buying their chips and making their own cards.  This was done to provide a bit of stability for ATI pricing, as they didn’t have to worry about a volatile component market that could cause their margins to plummet.  By selling just the chips to partners, ATI could more adequately control margins on their own product while allowing their partners to make their own deals and component choices for the finished card.

r9_285_itx_01.jpg

ATI had very limited graphics card production of their own, so they often would farm out production to second sources.  One of these sources ended up turning into Sapphire.  When ATI finally allowed other partners to produce and brand their own ATI-based products, Sapphire already had a leg up on the competition by being a large producer of ATI products.  They soon controlled a good portion of the marketplace through their contacts, pricing, and close relationship with ATI.

Since that time ATI has been bought up by AMD, and they no longer produce any ATI-branded cards.  Going vertical when it comes to producing their own chips and video cards was obviously a bad idea; we can look back at 3dfx and their attempt at vertical integration and how that ended for the company.  AMD obviously produces an initial reference version of their cards and coolers, but allows their partners to sell the “sticker” version and then develop their own designs.  This has worked very well for both NVIDIA and AMD, and it has allowed their partners to further differentiate their products from the competition.

r9_285_itx_02.jpg

Sapphire usually does a bang up job on packaging the graphics card. Oh look, a mousepad!

Sapphire is not as big of a player as they used to be, but they are still one of the primary partners of AMD.  It would not surprise me in the least if they still produced the reference designs for AMD and then distributed those products to other partners.  Sapphire is known for building very good quality cards, and their cooling solutions have been well received as well.  The company does have some stiff competition from the likes of Asus, MSI, and others in this particular market.  Unlike those two particular companies, Sapphire obviously does not make any NVIDIA-based boards.  This has been both a blessing and a curse, depending on where the cycle stands between AMD and NVIDIA and who has dominance in any particular marketplace.

Click here to read the entire Sapphire R9 285 ITX OC Review!

Rumor: AMD To Name Upcoming Flagship Fiji GPU "Radeon Fury"

Subject: Graphics Cards | May 29, 2015 - 11:05 AM |
Tagged: rumors, radeon, hbm, graphics, gpu, Fury, Fiji, amd

Another rumor has emerged about an upcoming GPU from AMD, and this time it's a possible name for the HBM-powered Fiji card a lot of us have been speculating about.

RAGE-FURY-MAXX.jpg

New box art revealed! (Or a product from early 2000.) Image: VideoCardz

The rumor from VideoCardz via Expreview (you have to love the multiple layers of reporting here) states that the new card will be named Radeon Fury:

"Radeon Fury would be AMD’s response to growing popularity of TITAN series. It is yet unclear how AMD is planning to adopt Fury naming schema. Are we going to see Fury XT or Fury PRO? Well, let’s just wait and see. This rumor also means that Radeon R9 390X will be a direct rebrand of R9 290X with 8GB memory."

Of course this is completely unsubstantiated, and Fury is a branding scheme from the ATI days, but who knows? I can only hope that if true, AMD will adopt all caps: TITAN! FURY! Feel the excitement. What do you think of this possible name for the upcoming AMD flagship GPU?

Source: VideoCardz

Rumor: Only Xeon-based Skylake CPUs Getting AVX-512

Subject: Processors | May 27, 2015 - 09:45 PM |
Tagged: xeon, Skylake, Intel, Cannonlake, avx-512

AVX-512 is an instruction set extension that widens the CPU's vector registers from 256-bit to 512-bit. It comes with a core specification, AVX-512 Foundation, and several extensions that can be added where they make sense. For instance, AVX-512 Exponential and Reciprocal Instructions (ERI) help solve transcendental problems, which occur in geometry and are useful for GPU-style architectures. As such, it appears in Knights Landing but not anywhere else.

intel-2015-instruction-set-support.png

Image Credit: Bits and Chips

Today's rumor is that Skylake, the successor to Broadwell, will not include any AVX-512 support in its consumer parts. According to the lineup, Xeons based on Skylake will support AVX-512 Foundation, Conflict Detection Instructions, Vector Length Extensions, Byte and Word Instructions, and Double and Quadword Instructions. Fused Multiply and Add for 52-bit Integers and Vector Byte Manipulation Instructions will not arrive until Cannonlake shrinks everything down to 10nm.

The main advantage of larger registers is speed. When you can fit 512 bits of data in a single register and operate on it at once, you are able to do several linked calculations together. AVX-512 can operate on sixteen 32-bit values at the same time, which is obviously sixteen times the compute throughput compared with doing just one at a time... if all sixteen undergo the same operation. This is especially useful for games, media, and other vector-based workloads (like scientific computing).
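As a concrete illustration of that sixteen-at-a-time claim, here is a minimal AVX-512F example that adds two arrays of floats, sixteen lanes per instruction. It needs a compiler and CPU with AVX-512F support (e.g. gcc -mavx512f), which, if the rumor holds, would mean a Skylake Xeon rather than a consumer part.

```c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[16], b[16], c[16];
    for (int i = 0; i < 16; i++) { a[i] = (float)i; b[i] = 100.0f; }

    /* One 512-bit register holds sixteen 32-bit floats, so a single
       add instruction performs sixteen additions at once. */
    __m512 va = _mm512_loadu_ps(a);
    __m512 vb = _mm512_loadu_ps(b);
    __m512 vc = _mm512_add_ps(va, vb);
    _mm512_storeu_ps(c, vc);

    for (int i = 0; i < 16; i++)
        printf("%.0f ", c[i]);  /* 100 101 102 ... 115 */
    printf("\n");
    return 0;
}
```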

This also makes me question whether the entire Cannonlake product stack will support AVX-512. While vectorization is a cheap way to get performance for suitable workloads, it does take up a large number of transistors (wider registers and data paths, extra instructions, etc.). Hopefully Intel will be able to afford the cost with the next die shrink.

$170 for 16GB of very overclockable DDR4-2666

Subject: Memory | May 26, 2015 - 06:22 PM |
Tagged: ddr4-2666, G.Skill, Ripjaws 4

The price may still sting a bit, but honestly it is only a small premium over many 16GB DDR3 kits, so the pricing on DDR4 is getting much better.  G.Skill's 16GB DDR4-2666 quad channel kit has timings of 15-15-15-35 and is fully XMP compliant, so getting the modules out of the clamshell packaging may be the hardest step in installing them.  Of course many readers here, just like at Bjorn3D, are not going to be satisfied with the default settings, which brings us to the overclocking results.  3048MHz @ 16-16-16-37 was perfectly stable in their testing at 1.35V, and for those who don't mind the long-term effects of upping the voltage to 1.4V, there is more headroom left.
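One way to judge whether 3048MHz at CAS 16 actually beats the rated 2666MHz at CAS 15 is to convert the CAS latency into nanoseconds; here is the quick arithmetic as a sketch.

```c
#include <stdio.h>

/* First-word latency in ns: CAS cycles divided by the memory clock
   (half the effective data rate), converted from MHz to ns. */
static double cas_ns(double data_rate_mts, double cas_cycles) {
    return cas_cycles / (data_rate_mts / 2.0) * 1000.0;
}

int main(void) {
    printf("DDR4-2666 CL15: %.2f ns\n", cas_ns(2666.0, 15.0));  /* ~11.3 ns */
    printf("DDR4-3048 CL16: %.2f ns\n", cas_ns(3048.0, 16.0));  /* ~10.5 ns */
    return 0;
}
```

So the overclock improves absolute latency as well as raw bandwidth.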

G.Skill_Ripjaws_4_2666_41-700x650.jpg

"G.Skill has been churning out enthusiast memory that overclocks like nothing else we’ve ever seen. Pop a set of Ripjaws 4 into your dream machine and settle into the BIOS for an overclocking experience like you’ve never had!"

Here are some more Memory articles from around the web:

Memory

Source: Bjorn3D

Computex 2015: Corsair STRAFE Mechanical Keyboard

Subject: General Tech | June 1, 2015 - 08:07 AM |
Tagged: STRAFE, mechanical keyboard, gaming keyboard, corsair, computex 2015, Cherry MX

Corsair has announced the STRAFE mechanical gaming keyboard featuring Cherry MX switches, and the company is calling it the “most advanced mono-backlit mechanical gaming keyboard available”.

STRAFE_NA_01.jpg

From Corsair:

“The STRAFE mechanical gaming keyboard’s brilliant red backlighting can be customized to a virtually unlimited number of lighting configurations and effects. Each key can be programmed with automated macros using CUE (Corsair Utility Engine) software. Users can choose from six unique lighting effects or craft their own custom profiles and share them on www.corsairgaming.com.”

The STRAFE features:

  • German-made Cherry MX red switches with gold contacts for fast, precise key presses
  • Fully programmable brilliant red LED backlighting for unrivaled personalization
  • USB pass-through port for easy connections
  • Textured and contoured FPS/MOBA keycaps
  • 100% anti-ghosting technology with 104-key rollover
  • Enhanced, easy-access multimedia controls

Strafe_02.jpg

The Corsair STRAFE has an MSRP of $109.99 and will be available in June.

Source: Corsair

Lenovo Tech World: Z41/Z51 and ideapad 100 announced

Subject: Mobile, Shows and Expos | May 27, 2015 - 12:27 AM |
Tagged: z51, z41, tech world, r9 m375, r9 m360, Lenovo, ideapad 100, amd

Today at their Tech World event in Beijing, Lenovo is taking the opportunity to announce some new mainstream notebook options.

z41_images_product_photography_z41_white_standard_02_photomaster_high_res.jpg

First off, we have the Lenovo Z41 and Z51. The 14-inch Z41 and 15.6-inch Z51 aim to refresh the previous Z40 and Z50 with Broadwell CPUs as well as new AMD discrete GPU options.

z51_images_product_photography_z51_white_hero_03_win_metro_high_res. METRO.jpg

Lenovo is using the Broadwell-U class of CPUs here, the same you would find in ultrabooks, so don't expect a CPU powerhouse; but for productivity-style tasks these machines should hit the sweet spot of price vs. performance, with a starting price of $549 for the base Z51.

z51_images_product_photography_z51_white_product_tour_03_high_res.jpg

Paired with the new AMD R9 M360 (Z41) or R9 M375 (Z51), these notebooks should also be able to play mainstream titles on the integrated 1080p display while coming in at just over $800.

The Lenovo Z51 and Z41 are available on Lenovo's site now.

Ideapad_100_14_Black-7-Outlook.jpg

Lenovo also announced a low-cost entry into the ideapad line utilizing Intel's BayTrail-M processors. The ideapad 100 is available in both 14-inch and 15-inch variants and seems to be aimed at the low-cost Chromebook market. 

Ideapad_100_14_Black-10-Metro.jpg

Starting at $249, the ideapad 100 seems like it will be a good option for users looking for a secondary option for basic web browsing and office tasks. 

 

Stay tuned for more from Lenovo's Tech World Event this week!

 

Source: Lenovo

Google I/O 2015: Android OS M Revealed

Subject: Mobile, Shows and Expos | May 29, 2015 - 03:42 PM |
Tagged: Android, google, google io, google io 2015

I'll be honest with you: I did not see a whole lot that interested me out of the Google I/O keynote. The company released a developer preview of its upcoming Android OS “M”, which refers to the thirteenth alphabetical release (although only eleven were formally lettered, because they started with “C”upcake). Version nomenclature aside, this release is supposed to tune the experience. While the platform could benefit from a tune-up, that kind of release is also synonymous with not introducing major features.

google-2015-android-m-googleio.jpg

But some things are being added, including “Google Now on Tap”. The idea is that Google will understand what is happening on screen and allow the user to access more information about it. In a demo on Engadget, the user was looking at scores for the Golden State Warriors. She asked “When are they playing next”, actually using the pronoun “they”, and the phone brought up their next game (it was against the Cavaliers).

Fingerprint reading and Android Pay are also being added to this release.

Other than that, it is mostly performance and usability. One example is “Doze State”, which allows the OS to update less frequently when the device is inactive. It is supposed to play nice with alarms and notifications though, which is good. Normally, I would wait to see if it actually works before commenting on it, but this seems like something that would only be a problem if no-one thought of it. Someone clearly did, because they apparently mentioned it at the event.

Android M, whatever it will actually be called, is expected to ship to consumers in the Fall.

Source: Tech Report

New server chips from Intel on the way

Subject: General Tech | May 27, 2015 - 12:27 PM |
Tagged: Purley, Intel, Skylake, Cannonlake, Grantley, Romley, knights landing

The Register has obtained slides describing the next families of Xeon processors to be released by Intel, the Purley platform, which includes Skylake.  There are some interesting new developments, including an on-die interface for either 10Gb/sec Ethernet or 100Gb/sec Omni-Path fabrics, which interested the participants at the HPC conference where the slides were shown.  Intel also mentioned a brand new memory architecture, described as offering four times the capacity and 500 times the speed of current NAND, all at a lower price per chip - which is likely something of an exaggeration on their part.  There were also new Phi chips, including the long-awaited Knights Landing, and workstation chips for use outside the server room.

intel-kdm-roadmap-1.jpg

"A presentation given at a conference on high-performance computing (HPC) in Poland earlier this month appears to have yielded new insight into Intel's Xeon server chip roadmap.

A set of slides spotted by our sister site The Platform indicates that Chipzilla is moving toward a new server platform called "Purley" that will debut in 2017 or later."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

Blizzard Releases Overwatch Gameplay Video Ahead of E3

Subject: General Tech, Shows and Expos | May 27, 2015 - 08:33 PM |
Tagged: overwatch, E3 2015, E3, blizzard

Various companies have begun teasing what we might see at this year's E3 expo. Blizzard has not historically had a big presence at the event, though. With the size and scope of BlizzCon, the company usually saves its announcements for then. In fact, I cannot think of a single non-trivial thing that Blizzard has done at E3 since the expo downsized after E3 2006.

This year, on the other hand, Blizzard will be present at AMD and PC Gamer's E3 show. The recent Overwatch previews could be leading up to this event, which takes place three weeks from yesterday. The recent video embedded above, featuring Tracer, is pretty interesting too. It shows how useful a light assault player can be if they don't obey the space-time continuum. The last two-thirds of the video show off an impressive kill streak.

Again, expect more E3 coverage leading up to the Expo on June 16th.

Source: Blizzard