Author:
Manufacturer: Various

It's more than just a branding issue

As a part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to cover as succinctly as possible, given the time constraints and the fact that the article was focused on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:

First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tearing and without stutter. On the LG 34UM67 for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate of the panel. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target, which in this example is 48 FPS.

AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. For frame rates both above and below the variable refresh area, AMD allows gamers to continue to select a VSync enabled or disabled setting. That setting is handled just as it is today whenever your game's frame rate extends outside the VRR window. So, for our 34UM67 example, if your game is capable of rendering at 85 FPS then you will either see tearing on your screen (if you have VSync disabled) or you will get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, below the minimum of the VRR window, then you will again see either tearing (with VSync off) or the potential for stutter and hitching (with VSync on).

But what happens with this FreeSync monitor and a theoretical G-Sync monitor below the window? AMD’s implementation means that you get the option of disabling or enabling VSync. For the 34UM67, as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen when the frame rate dips below the minimum refresh rate of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple-drawing the frame, taking the refresh rate back up to 56 Hz. It’s a clever trick that preserves the goals of VRR and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work, hence the current implementation in a G-Sync module.
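
To make that frame-multiplication behavior concrete, here is a minimal sketch of the logic as we understand it from our measurements, using a hypothetical 30-144 Hz panel. The redraw threshold below is our own assumption, chosen only so the sketch reproduces the refresh rates we observed; NVIDIA has not published the actual algorithm running on the G-Sync module.

```python
# Sketch of "below the window" frame multiplication on a hypothetical
# 30-144 Hz G-Sync panel. REDRAW_TARGET_HZ is an assumed threshold,
# not a published NVIDIA figure.

PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144
REDRAW_TARGET_HZ = 50   # assumed; chosen to match our measured results

def effective_refresh(frame_rate_fps: float):
    """Return (times each frame is drawn, resulting panel refresh rate in Hz)."""
    if frame_rate_fps >= PANEL_MIN_HZ:
        # Inside the VRR window: one panel refresh per rendered frame.
        return 1, min(frame_rate_fps, PANEL_MAX_HZ)
    # Below the window: redraw each frame until the refresh rate is
    # comfortably back above the panel minimum.
    multiplier = 2
    while frame_rate_fps * multiplier < REDRAW_TARGET_HZ:
        multiplier += 1
    return multiplier, frame_rate_fps * multiplier

for fps in (29, 25, 14):
    times, hz = effective_refresh(fps)
    print(f"{fps} FPS -> each frame drawn {times}x, panel refreshes at {hz:.0f} Hz")
# 29 FPS -> 2x / 58 Hz, 25 FPS -> 2x / 50 Hz, 14 FPS -> 4x / 56 Hz
```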

As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) took it upon ourselves to try to understand and explain the implementation differences with the help of some science. The video below is where the heart of this story is focused, though I have some visual aids embedded after it.

Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.

Continue reading our story dissecting NVIDIA G-Sync and AMD FreeSync!!

Author:
Manufacturer: Futuremark

Our first DX12 Performance Results

Late last week, Microsoft approached me to see if I would be interested in working with them and with Futuremark on the release of the new 3DMark API Overhead Feature Test. Of course I jumped at the chance, with DirectX 12 being one of the hottest discussion topics among gamers, PC enthusiasts and developers in recent history. Microsoft set us up with the latest iteration of 3DMark and the latest DX12-ready drivers from AMD, NVIDIA and Intel. From there, off we went.

First we need to discuss exactly what the 3DMark API Overhead Feature Test is (and also what it is not). The feature test will be a part of the next revision of 3DMark, which will likely ship alongside the full Windows 10 release. Futuremark claims that it is the "world's first independent" test that allows you to compare the performance of three different APIs: DX12, DX11 and even Mantle.

It was almost one year ago that Microsoft officially unveiled the plans for DirectX 12: a move to a more efficient API that can better utilize the CPU and platform capabilities of future, and most importantly current, systems. Josh wrote up a solid editorial on what we believe DX12 means for the future of gaming, and in particular for PC gaming, that you should check out if you want more background on the direction DX12 has set.

3dmark-api-overhead-screenshot.jpg

One of the keys to DX12's improved efficiency is the ability for developers to get closer to the metal, a phrase indicating that game and engine coders can access more of the system's power (CPU and GPU) without having their hands held by the API itself. The most direct benefit of this, as we saw with AMD's Mantle implementation over the past couple of years, is an increase in the number of draw calls that a given hardware system can handle in a game engine.
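
As a rough illustration of why that matters (this is not the 3DMark test itself), per-draw-call CPU cost directly caps how many draws fit in a frame. The microsecond figures below are made-up placeholders purely to show the relationship, and the 30 FPS frame budget mirrors the threshold the overhead test is built around as we understand it.

```python
# Back-of-the-envelope: lower per-draw-call CPU overhead -> more draw calls
# per frame. The per-call costs are illustrative placeholders, not measurements.

FRAME_BUDGET_MS = 1000 / 30          # one frame at a 30 FPS target

def max_draw_calls(per_call_cpu_cost_us: float) -> int:
    """How many draw calls fit into one ~33.3 ms frame on a single thread."""
    return int(FRAME_BUDGET_MS * 1000 / per_call_cpu_cost_us)

for api, cost_us in (("high-overhead API", 25.0), ("low-overhead API", 2.5)):
    print(f"{api:18s} ~{cost_us:4.1f} us/call -> "
          f"{max_draw_calls(cost_us):,} draw calls per frame")
```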

Continue reading our overview of the new 3DMark API Overhead Feature Test with early DX12 Performance Results!!

Microsoft Changes Secure Boot Rules with Windows 10, Could Mean OS Lockout

Subject: General Tech | March 22, 2015 - 09:14 PM |
Tagged: windows 10, Secure Boot, microsoft, linux

Secure Boot is a security measure that prevents malware from interfering with the boot process, but it can also prevent unsigned operating systems from booting on the same hardware. While Microsoft’s “Designed for Windows 8” guidelines required manufacturers to permit users to disable the Secure Boot option, the upcoming Windows 10 release will not carry that requirement forward. At WinHEC it was revealed that the Windows 10 guidelines leave it up to the OEM to decide whether to allow users to disable UEFI Secure Boot in the system setup, and making this optional raises an interesting question about compatibility with other operating systems. OEMs will still be required to ship computers with Secure Boot enabled to comply with “Designed for…” rules, and while they could then choose to provide the option to disable it (currently the required standard), preventing user installation of other OS software could be seen as a way to streamline support by eliminating variables.

windows-10-secure-boot.png

Image Credit: Ars Technica

Why does this matter if most people who purchase a Windows 10 computer will run Windows 10 on it? This could be an issue for someone who wished to either replace that Windows 10 installation with another OS, or simply dual-boot with an OS that didn’t support the Secure Boot feature (which could be a build of Linux or even an older version of Windows). Requiring OS files to contain digital signatures effectively locks out other operating systems without special workarounds or keys, and although open-source operating systems represent a small segment of the market thanks to the way computer hardware is sold to most people, it is concerning to think future hardware could cause a loss of the freedom of choice we have always had with operating systems.

Microsoft enjoys market dominance with Windows thanks to its licensing model (giving it a monopoly on pre-built PC systems that don’t have an Apple or Chrome logo on them), but reportedly began considering possibilities "to assert its intellectual property against Linux or any other open-source software” a decade ago, and this has reached farther than they probably imagined with the adoption of Android (from which Microsoft makes money on every device sold). Is this Secure Boot move nefarious, and does Microsoft consider Linux to be a potential threat to their desktop market share? It could be that Microsoft would simply like to claim that Windows 10 is the safest version of Windows yet, and that isn’t a bad thing for consumers. Unless they want to easily use another OS on the hardware they purchased, that is.

Source: Ars Technica

PCPer Live! Intel SSD Live Stream and Giveaway!

Subject: General Tech, Storage, Shows and Expos | March 27, 2015 - 06:12 PM |
Tagged: video, ssd, live, Intel, giveaway, contest

Earlier this month we spotted a new and potentially very exciting SSD while looking through some PAX East coverage around the web. It appears to be a PCI Express based Intel SSD, likely based on the same technology as the P3700-series of NVMe drives released last June. And today, if you take a look at this Intel promotional landing page you'll see a timer and countdown that ends on April 2nd.

Sounds like something must be up, huh?

Well, in totally unrelated news, PC Perspective and Intel are partnering together for a live stream to discuss "SSD related topics" on April 2nd.

pcperlive.png

Intel SSD Live Stream and Giveaway

12pm PT / 3pm ET - April 2nd

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Joining us for the live event will be Intel's Bryn Pilney and Kei Kobayashi, making a follow-up appearance after jumping on stage with us at Quakecon 2014. During the event we'll discuss some of the history of Intel's move into the SSD market, how consumers benefit from Intel development and technology, and a certain new product that will be making an appearance on that same day.

And of course, what's a live stream event without some hardware to give away?!? Here's what we have on the docket for those that attend:

  • 2 x Intel 180GB 530 Series SSDs
  • 2 x Intel 480GB 730 Series SSDs
  • 2 x Intel Unreleased SSDs (??)

ssd730.jpg

Huge thanks to Intel for supporting our viewers and readers with hardware to give away!

The event will take place Thursday, April 2nd at 3pm ET / 12pm PT at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience, asking questions for me and Intel to answer live. To win the prizes you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

ssdpromo2.jpg

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Intel?

So join us! Set your calendar for this coming Thursday at 3pm ET / 12pm PT and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

Rumor: Intel Is Now Powering Both Surface and Surface Pro

Subject: Processors, Mobile | March 25, 2015 - 09:51 PM |
Tagged: Intel, core m, atom, surface, Surface 2, Windows 8.1, windows 10

The stack of Microsoft tablet devices had high-end Intel Core processors hovering over ARM SoCs, the two separated by a simple “Pro” label (and Windows 8.x versus Windows RT). While the Pro line has been kept reasonably up to date, the lower tier has been stagnant for a while. That is apparently going to change. WinBeta believes that a new, non-Pro Surface will be announced soon, at or before BUILD 2015. Unlike previous Surface models, it will be powered by an x86 processor from Intel, either an Atom or a Core M.

microsoft-surface2.jpg

This also means it will run Windows 8.1.

The article claims, somewhat tongue-in-cheek, that Windows RT is dead. No. But still, the device should be eligible for a Windows 10 upgrade when it launches, unlike the RT-based Surfaces. Whether that is a surprise depends on the direction you view it from. I would find it silly for Microsoft to release a new Surface device, months before an OS update, but design it to be incompatible with it. On the other hand, it would be the first non-Pro Surface to do so. Either way, it was reported.

The “Surface 3”, whatever it will be called, is expected to be a fanless design. VR-Zone expects that it will be similar to the 10.6-inch, 1080p form factor of the Surface 2, but that seems to be their speculation. That is about all that we know thus far.

Source: WinBeta
Subject: Storage
Manufacturer: OCZ Storage Solutions

Introduction, Specifications and Packaging

Introduction:

OCZ has been on a fairly steady release track since their acquisition by Toshiba. Having previously pared their product lines down to a minimalist set, they have since been filling out the other half of that cycle with releases like the OCZ AMD Radeon R7. Today we see another addition to OCZ's lineup, in the form of a newer Vector - the Vector 180 Series:

ocz_vector180_image1.jpg

Today we will run all three available capacities (240GB, 480GB, and 960GB) through our standard round of testing. I've thrown in an R7 as a point of comparison, as well as a handful of the competition.

Specifications:

OCZ Vector 180 slides - 7.jpg

Here are the specs from OCZ's slide presentation, included here as they give a good spec comparison across OCZ's SATA product range.

Packaging:

DSC09886.JPG

Standard packaging here. 3.5" adapter bracket and Acronis 2013 cloning software product key included.

Continue reading our review of the new OCZ Vector 180 SSD!!

Acer's Latest 15.6" Chromebook Powered By Core i5 (Broadwell-U) Processor

Subject: General Tech | March 26, 2015 - 01:34 AM |
Tagged: core i5, Chromebook, chrome os, broadwell-u, acer

Acer is adding an updated Chromebook to its education-focused C910 lineup. The new Acer C910-54M1 ups the hardware ante by incorporating a Broadwell-U based Intel Core i5 processor which will make this the fastest Chromebook on the market (for what that's worth). 

This new C910 remains aimed at schools and businesses with a sturdy frame, a large (for a Chromebook) 15.6" display at up to 1080p, and eight hours of battery life. Below the display sits an island-style keyboard and a large trackpad. Except for the arrow keys, Acer was able to use "regular" sized keys, and it did not shrink the shift or backspace keys, which can be an annoyance on other compact keyboards. A webcam and two large upward-facing speakers are also present on the C910.

Acer C910 Broadwell-U Core i5 Powered Chrome OS Chromebook.jpg

External I/O includes:

  • 1 x USB 3.0
  • 1 x USB 2.0
  • 1 x HDMI
  • 1 x SD card reader

The port selection is about what one would expect from a Chromebook, but the inclusion of USB 3.0 is welcome for accessing external storage.

Internally, the C910 Chromebook is powered by a dual core (four threads with Hyper-Threading) Broadwell-U Core i5 5200U processor clocked at 2.2GHz base and up to 2.7GHz Turbo Boost with a 15W TDP and 3MB cache. This particular processor includes Intel HD Graphics 5500 clocked at up to 900 MHz. Other hardware includes 4GB DDR3 memory and a 32GB SSD. Wireless hardware includes 802.11ac Wi-Fi and Bluetooth 4.0. 

Acer's new Chromebook is big and powerful, but will the increased hardware provide a noticeably better Chrome OS experience? Intel (naturally) seems to think so, given its push to get Core i3 processors into Chromebooks last year. The Broadwell-U Core i5 should be just as fast (maybe even a bit faster, with a smoother UX and graphics) while sipping power. The alleged eight hours of battery life is impressive as well, considering the hardware inside. The downside, because of course there always is one, is pricing. The C910-54M1 will be available in April with a 1080p display for $500.

At that price point, it is squarely in budget Windows notebook territory as well as high end convertible (e.g. Bay Trail) tablet territory. It will be interesting to see how it ends up doing compared to the other options which each have their own trade offs.

Are you interested in a Chromebook with a Core i5 processor?

Source: Maximum PC

MSI Unveils First AMD AM3+ Motherboard With USB 3.1

Subject: Motherboards | March 25, 2015 - 08:00 AM |
Tagged: usb 3.1, usb, msi, amd, am3+, 10Gbps

Last week, MSI launched a slew of new USB 3.1 equipped motherboards. Today, the company is releasing more details on one of the AMD-based products: the MSI 970A SLI Krait Edition. This upcoming motherboard is geared towards gamers using AMD FX (AM3+) processors and supports multi-GPU setups (both SLI and CrossFire). The 970A SLI Krait Edition has a black and white color scheme with rich expansion options and large aluminum heatsinks over the VRMs and northbridge.

MSI 970A SLI Krait Edition Motherboard.png

The AM3+ processor socket sits to the left of four DDR3 memory slots. Six expansion slots take up the majority of the lower half of the board and include two PCI-E x16, two PCI-E x1, and two PCI slots. Six SATA ports occupy the bottom-right corner with four at 90-degree angles. MSI is using its latest “Military Class 4” capacitors and other hardware along with gold audio traces connecting the rear IO audio jacks to the onboard sound chip.

Speaking of rear IO, you will find the following ports on the 970A SLI Krait Edition.

  • 2 x PS/2
  • 6 x USB 2.0
  • 2 x USB 3.1
  • 1 x Gigabit Ethernet
  • 6 x Analog Audio

The main feature that MSI is pushing with this new board is the addition of two USB 3.1 (Type A) ports to the AMD platform. This is the first AM3+ motherboard to support the faster standard – up to 10 Gbps using an Asmedia ASM1352R controller – while also being backwards compatible with older USB 3.0 and USB 2.0 devices.

MSI has not yet released pricing or availability, but expect it to launch soon for less than $100.

Josh's Thoughts

Few specifications have been released about this board so far, and no timetable for the launch has been given. It is a finished product and should be out "soon," as Tim mentioned.

There are a few things we can gather from the photo of the board. The audio solution is not nearly as robust as we saw with the 970 Gaming motherboard. I doubt it will have the headphone amplification, and the filtering is going to be weaker due to the fewer caps used. The audio is still physically isolated on the PCB, but it has not received the same focus as what we saw on the 970 Gaming.

It looks like it is a full 8+2 power phase implementation, as it is taking up more space on the board than the 6+2 unit on the 970 Gaming did.  This should allow for a greater selection of CPUs to be used, as well as potentially greater overclocking ability.  It does not feature a separate SATA controller, so all 6 SATA ports on the board are handled by the SB950.  There are no external e-SATA ports, which really is not a big deal as those are rarely used.

This looks to be a nice addition to the fading AM3+ market. For those holding onto their AMD builds who wish to upgrade, this looks to be an inexpensive option with next-generation connectivity. MSI looks to have paid the licensing fee necessary to support SLI, and they utilize the same AMD 970 chipset as the 970 Gaming, a chipset that is not supposed to be able to split the single x16 PEG connection into two x8 slots. Some interesting design and chippery are required to do that.

Source: MSI
Author:
Subject: Mobile
Manufacturer: Dell

Specifications

The perfect laptop: it is every manufacturer’s goal. Obviously no one has gotten there yet (or we would have all stopped writing reviews of them). At CES this past January, we got our first glimpse of a new flagship Ultrabook from Dell: the XPS 13. It got immediate attention for some of its physical characteristics, like an ultra-thin bezel and a 13-in screen in the body of a typical 11-in laptop, all while being built in a sleek, thin and light design. It’s not a gaming machine, despite what you might remember from the XPS line, but the Intel Core-series Broadwell-U processor keeps performance speedy in standard computing tasks.

01.jpg

As a frequent traveler who tends to err on the side of thin and light designs, as opposed to high-performance notebooks with discrete graphics, I find the Dell XPS 13 immediately compelling on a personal level as well. I have long been known as a fan of what Lenovo builds for this space, trusting my work machine requirements to the ThinkPad line for years and years. Dell’s new XPS 13 is a strong contender to take away that top spot for me and perhaps force me down the path of an upgrade of my own. So, you might consider this review as my personal thesis on the viability of said change.

The Dell XPS 13 Specifications

First, make sure as you hunt around the web for information on the XPS 13 that you are focusing on the new 2015 model. Much like we see from Apple, Dell reuses model names and that can cause confusion unless you know what specifications to look for or exactly what sub-model you need. Trust me, the new XPS 13 is much better than anything that existed before.

Continue reading our review of the Dell XPS 13 Notebook!

Looking for an Android emulator?

Subject: General Tech | March 23, 2015 - 01:08 PM |
Tagged: DuOS, Android

Be it for development reasons or a serious addiction to a certain game, there are those who find themselves wanting to run Android on a device larger than a phone or tablet.  For a mere $10 you can pick up DuOS, a supported Android emulator which will run on a PC, or at least on a modern Intel-powered machine, as the emulator uses VT-x to run properly, which is a shame for AMD users.  It will not run on ARM hardware, but it is ARMv7 compliant, so with DuOS you can use applications designed for the architecture that most mobile hardware runs on.  The footprint on your machine is small, 16GB maximum, and you can root the installation very quickly if you are looking to test your modifications to the Android OS before installing them on your phone.  Techgage does mention that there is no help menu and that you will need to update the application manually, so plan to spend some time on the DuOS forums if you will be making use of this software.

AMIDuOS-Main-Android-Desktop-680x384.png

"With so many devices out there, on many different operating systems, deciding which one you should purchase can be difficult. DuOS helps lessen that burden by providing an Android tablet experience on your Windows computer. Does DuOS step beyond the barriers that currently divide these OS’s, or are we still locked into one OS per device?"

Here is some more Tech News from around the web:

Tech Talk

 

Source: Techgage

Intel / Micron Announce 3D NAND Production with Industry's Highest Density: >10TB on a 2.5" SSD

Subject: Storage | March 26, 2015 - 02:12 PM |
Tagged: storage, ssd, planar, nand, micron, M.2, Intel, imft, floating-gate, 3d nand

Intel and Micron are jointly announcing new 3D NAND technology that will radically increase solid-state storage capacity going forward. The companies have indicated that moving to this technology will allow for the type of rapid increases in capacity that are consistent with Moore’s Law.

IMFT_Slide_1.png

The way Intel and Micron are approaching 3D NAND is very different from existing 3D technologies from Samsung and now Toshiba. The implementation of floating-gate technology and “unique design choices” has produced startling densities of 256 Gb MLC, and a whopping 384 Gb with TLC. The choice to base this new 3D NAND on floating-gate technology allows development with a well-known entity, and benefits from the knowledge base that Intel and Micron have working with this technology on planar NAND over their long partnership.

What does this mean for consumers? This new 3D NAND enables greater than 10TB capacity on a standard 2.5” SSD, and 3.5TB on M.2 form-factor drives. These capacities are possible with the industry’s highest density 3D NAND, as the >3.5TB M.2 capacity can be achieved with just 5 packages of 16 stacked dies with 384 Gb TLC.
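
For reference, here is the capacity math behind that M.2 example, using the die and stack figures quoted above; the note about overprovisioning at the end is our own assumption.

```python
# Sanity check of the >3.5 TB M.2 figure from the announced die density.

GBIT_PER_DIE = 384       # TLC die density announced by Intel/Micron
DIES_PER_PACKAGE = 16    # 16-die stacks
PACKAGES_ON_M2 = 5       # per the M.2 example above

raw_gigabits = GBIT_PER_DIE * DIES_PER_PACKAGE * PACKAGES_ON_M2
raw_terabytes = raw_gigabits / 8 / 1000
print(f"{PACKAGES_ON_M2} packages x {DIES_PER_PACKAGE} dies x {GBIT_PER_DIE} Gb "
      f"= {raw_terabytes:.2f} TB raw NAND")
# ~3.84 TB raw, in line with ">3.5 TB" once some capacity is set aside
# for overprovisioning and spare area (our assumption).
```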

vnand crop.png

A 3D NAND cross section from Allyn's Samsung 850 Pro review

While such high density might suggest reliance on ever-shrinking process technology (and the inherent loss of durability associated with it), Intel is likely using a larger process for this NAND. Though they would not comment on this, Intel could be using something roughly equivalent to 50nm flash with this new 3D NAND. In the past, die shrinks have been used to increase capacity per die (and yields), such as IMFT's move to 20nm back in 2011, but with the ability to achieve greater capacity vertically using 3D cell technology, a smaller process is not necessary to achieve greater density. Additionally, working with a larger process would allow for better endurance; 50nm MLC, for example, was rated on the order of 10,000 program/erase cycles. Samsung similarly moved to a larger process with their initial 3D NAND, going from their existing 20nm technology back to 30nm for 3D production.

IMFT_Slide_2.png

This announcement is also interesting considering that Toshiba has just entered this space as well, having announced 48-layer, 128 Gb density 3D NAND. Like Samsung, Toshiba is moving away from floating-gate and using its own charge-trap implementation called BiCS (Bit Cost Scaling). With this Intel/Micron announcement, however, the emphasis is on the ability to offer a 3x increase in capacity using the venerable floating-gate technology from planar NAND, which gives Intel / Micron an attractive position in the market - depending on price/performance, of course. And while these very large capacity drives seem destined to be expensive at first, the cost structure is likely to be similar to current NAND. All of this remains to be seen, but this is indeed promising news for the future of flash storage, as it will now scale up to (and beyond) spinning media capacity - unless 3D tech is implemented in hard drive production, that is.

IMFT_Slide_3.png

So when will Intel and Micron’s new technology enter the consumer market? It could be later this year as Intel and Micron have already begun sampling the new NAND to manufacturers. Manufacturing has started in Singapore, plus ground has also been broken at the IMFT fab in Utah to support production here in the United States.

Source: Intel

Pixar Renderman Is Free for Non-Commercial Use

Subject: General Tech | March 24, 2015 - 09:44 PM |
Tagged: renderman, pixar, disney

Well that's a surprise. Pixar Renderman, the software used and developed by Pixar to turn computer-represented geometry into wonderful images, is now free for non-commercial use. This is not quite as free as Unreal Engine 4, which does not require royalties for rendered audio/video at all because the content is viewed without the engine, but free for non-commercial is still a big deal considering what Renderman is. While Pixar is known for their movies, they are very much software engineers.

pixar-mike-scully.jpg

Thumbs up all around.

Image Credit: Pixar via Animation Magazine

Currently the rendering package integrates with Autodesk Maya and KATANA by The Foundry. Pixar has Cinema4D and Houdini listed as “in development”. They also claim to be interested in 3D Studio Max, because of course they are, as well as Modo, Rhino, Lightwave, and Blender.

Yay Blender!

The above list only considers dedicated plug-ins. Pixar Renderman can also be run as a standalone application that accepts their file format, “RIB”, from a command-line interface. There have been many of these for Blender and other 3D suites over the years - basically forever. A plug-in is nicer for artists, however, and it is good to see Pixar is not afraid of pairing open-source suites with their proprietary rendering package. Also, "Blenderman" is a hilarious name.

Source: Pixar

Saved so much using Linux you can afford a Titan X?

Subject: Graphics Cards | March 27, 2015 - 04:02 PM |
Tagged: gtx titan x, linux, nvidia

Perhaps somewhere out there is a Linux user who wants a TITAN X, and if there is, they will like the results of Phoronix's testing.  The card works perfectly straight out of the box with the latest 346.47 driver as well as the 349.12 Beta; if you want to use Nouveau then don't buy this card.  The TITAN X did not win any awards for power efficiency, but for OpenCL tests, synthetic OpenGL benchmarks, and Unigine on Linux it walked away a clear winner.  Phoronix, and many others, hope that AMD is working on an updated Linux driver to accompany the new 300 series of cards we will see soon, to help them be more competitive on open source systems.

If you are sick of TITAN X reviews by now, just skip to their 22 GPU performance roundup of Metro Redux.

image.php_.jpg

"Last week NVIDIA unveiled the GeForce GTX TITAN X during their annual GPU Tech Conference. Of course, all of the major reviews at launch were under Windows and thus largely focused on the Direct3D performance. Now that our review sample arrived this week, I've spent the past few days hitting the TITAN X hard under Linux with various OpenGL and OpenCL workloads compared to other NVIDIA and AMD hardware on the binary Linux drivers."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Phoronix

Windows Apps Still Smell Like Windows RT

Subject: General Tech | March 25, 2015 - 06:23 PM |
Tagged: microsoft, windows, windows 10, winRT, windows rt

Even though I am really liking the Windows 10 operating system from a technical standpoint, I did not mind Windows 8.x, as software, either. My concern was its promotion of the Windows Store for the exact same reasons that I dislike the iOS App Store. Simply put, for your application to even exist, Microsoft (or Apple) needs to certify you as a developer, which they can revoke at any time, and they need to green light your creations.

windows-10.png

This has a few benefits, especially for Microsoft. First and foremost, it gives them a killswitch for malicious software and their developers. Second, it gives them as much control over the platform as they want. If devices start flowing away from x86 to other instruction sets, like we almost saw a few years ago, then Windows can pick up and go with much less friction than the corner they painted themselves into with Win32.

This also means that developers need to play ball, even with terms that Microsoft is forced to apply because of pressure from specific governments. LGBT groups should be particularly concerned, as other platforms are already banning apps that are designed for their members. Others could be concerned about encryption and adult art, even in Western nations. If Microsoft, or someone with authority over them, doesn't want your content to exist, it's gone (unless it can run in a web browser).

On the plus side, I don't see the rule where third-party browser engines are banned anymore. When Windows 8 launched, all browsers needed to be little more than a reskin of Internet Explorer.

Beyond censorship, if Microsoft does not offer a side-loading mechanism for consumers, you also might need to give Microsoft a cut of your sales. You don't even seem to be able to give your app to specific people. If you want to propose to your significant other via a clever app, there does not seem to be a method to share it outside of the Windows Store unless you set up their device as a Windows developer device ahead of time.

Why do I say all this today? Because Microsoft has branded Universal Apps as Windows apps, and their strategy seems to be completely unchanged in these key areas. What kept me from updating to Windows 8 was not its user interface. It was the same thing that brought me to develop in Web technologies and volunteer for Mozilla.

It was the developer certification and lack of side-loading for modern apps.

I get it. Microsoft is tired of being bullied with crap about how it is insecure and a pain for the general public. At the very least, they need a way for users to opt out, though. What they are doing with Windows 10 is very nice, and I would like to see it as my main operating system, but I need to prioritize alternative platforms if this one is heading in a very dark direction.

Win32 might be a legacy API, but the ability to write what I want should not be.

Source: Thurrott.com

Reliable high volume storage; the 4TB Toshiba MG04ACA400A

Subject: Storage | March 23, 2015 - 03:47 PM |
Tagged: toshiba, MG04ACA400A, datacenter, enterprise

Toshiba's new MG04ACA series is a line of enterprise-class HDDs available in 1TB increments from 2TB to 6TB, shipping with either 4K native or 512-byte emulated sectors depending on your preference.  Mad Shrimps just wrapped up a review of the 4TB model, which certainly cannot match an SSD for speed but is rated for 1,400,000 hours and workloads of 550TB a year of constant usage.  You do pay a premium for enterprise-level drives, but spinning rust is still far more economical at high densities than flash-based drives are.  If you are looking for reliable HDDs for your servers, check this review out.

2.jpg

"The new MG04ACA series from Toshiba is composed from drives which are meant for enterprise, mission-critical applications, while sporting higher transfer rates and capacities. The tested sample comes with 128MB of cache and comes in two versions, depending on the applications it is needed for: with 512 sector emulation or strictly with 4K sector. Make sure to choose wisely which drive is for you and your setups in order to bypass any incompatibilities which may arise."

Here are some more Storage reviews from around the web:

Storage

 

Source: Mad Shrimps

Logitech MX Master Mouse Announced

Subject: General Tech | March 27, 2015 - 07:03 AM |
Tagged: mx master, mx, mouse, logitech

In the universe of computer mice, Logitech is one of the best-known manufacturers. Their newest offering, the Logitech MX Master Wireless Mouse, is not part of their “G Series” gaming line. At a price of $99.99 USD, or $119.99 CAD, it is their most expensive offering in that class.

logitech-mx-master-top.jpg

The MX Master is a five-button, right-handed mouse. While that is not particularly exciting, one interesting feature is the horizontal scrolling tumbler on the thumb rest. The wheel on top scrolls up and down, while the one on the side can scroll left and right (or be reconfigured with Logitech's software). It is also a laser mouse that is capable of tracking on many types of surfaces, including thicker sheets of glass. It can be paired to three separate devices at once, either by Bluetooth or Logitech's proprietary receiver. Its rechargeable battery lasts about 40 days at 6 hours of use per day. Four minutes of charging yields about six hours of usage, and you can apparently even use the mouse while tethered.

The Logitech MX Master will be available in April for $99.99 USD.

Source: Logitech

NVIDIA Quadro M6000 Announced

Subject: Graphics Cards | March 23, 2015 - 07:30 AM |
Tagged: quadro, nvidia, m6000, gm200

Alongside the Titan X, NVIDIA has announced the Quadro M6000. In terms of hardware, they are basically the same component: 12 GB of GDDR5 on a 384-bit memory bus, 3072 CUDA cores, and a reduction in double precision performance to 1/32nd of its single precision. The memory, but not the cache, is capable of ECC (error-correction) for enterprises who do not want a stray photon to mess up their computation. That might be the only hardware difference between it and the Titan X.

nvidia-quadro-m6000.jpg

Compared to other Quadro cards, it loses some double precision performance as mentioned earlier, but it will be an upgrade in single precision (FP32). The add-in board connects to the power supply with just a single eight-pin plug. Technically, with its 250W TDP, it is slightly over the combined rating of the PCIe slot and one eight-pin connector, but NVIDIA told Anandtech that they're confident it won't matter for the card's intended systems.
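
For those curious, the arithmetic behind that caveat is simple, using the nominal PCIe power-delivery figures (75 W from the slot, 150 W from an eight-pin connector):

```python
# Nominal power budget vs. the card's TDP.

SLOT_W = 75          # nominal PCIe x16 slot power delivery
EIGHT_PIN_W = 150    # nominal 8-pin PCIe connector rating
CARD_TDP_W = 250     # Quadro M6000 / Titan X TDP

rated_w = SLOT_W + EIGHT_PIN_W
print(f"Rated delivery: {rated_w} W, card TDP: {CARD_TDP_W} W, "
      f"over the nominal spec by {CARD_TDP_W - rated_w} W")
```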

That is probably true, but I wouldn't put it past someone to do something spiteful given recent events.

The lack of double precision performance (IEEE 754 FP64) could be disappointing for some. While NVIDIA would definitely know their own market better than I do, I was under the impression that a common workstation system for GPU compute was a Quadro driving a few Teslas (such as two of these). It would seem weird for a company to have such a high-end GPU be paired with Teslas that have such a significant difference in FP64 compute. I wonder what this means for the Tesla line, and whether we will see a variant of Maxwell with a large boost in 64-bit performance, or if that line will be in an awkward place until Pascal.

Or maybe not? Maybe NVIDIA is planning to launch products based on an unannounced, FP64-focused architecture? The aim could be to let the Quadro deal with the heavy FP32 calculations, while the customer could opt to load co-processors according to their double precision needs. It's an interesting thought as I sit here at my computer musing to myself, but then I immediately wonder why they did not announce it at GTC if that is the case. If that is the case, and honestly I doubt it because I'm just typing unfiltered thoughts here, you would think they would kind-of need to be sold together. Or maybe not. I don't know.

Pricing and availability are not currently known, except that availability is “soon”.

Source: Anandtech

Corsair Releases Dominator Platinum DDR4 3400MHz Memory Kits

Subject: Memory | March 23, 2015 - 02:08 PM |
Tagged: overclocking, Dominator Platinum Series, ddr4-3400, ddr4, corsair

Speaking of components at the $999 price point, Corsair has just released a RAM kit aimed at the serious overclocker.  The Dominator Platinum Series 16GB DDR4-3400MHz kit now holds the record for the fastest DDR4 frequency, an impressive 4365.6MHz achieved on the Gigabyte X99-SOC Champion board; you will be seeing more of both the motherboard and these DIMMs on this page in the near future.

If you are looking for RAM that handles LN2 and serious overclocking well, these Corsair DIMMs are currently the best in class on the market.

DOM_DDR4-CUTAWAY-FEAT-ORANGE.jpg

The Dominator Platinum Series DDR4 3400MHz 16GB memory kits debuted at CES in January. The new kits are performance tuned to run air-cooled at an incredible 3400MHz and beyond on the Gigabyte X99-SOC Champion motherboard. The memory and motherboard duo together create one of the highest performance enthusiast PC platforms currently available.

Dominator Platinum Series 16GB (4 x 4GB) 3400MHz DDR4 Memory
The fastest DDR4 memory available from Corsair, the Dominator Platinum 3400MHz 16GB (4x4GB, 16-18-18-36) memory kits have a striking industrial design with an orange anodized heat spreader that matches the color scheme on Gigabyte SOC motherboards. Like all Dominator Platinum memory modules, the new kits have patented DHX technology for cooler operation, user-swappable colored “light pipes” for customizable downwash lighting, and Corsair Link compatibility for real-time temperature monitoring. Dominator Platinum memory is built with hand-screened ICs, undergoes rigorous performance testing, and incorporates state-of-the-art cooling for reliable performance in demanding environments.

“Each Dominator Platinum 3400MHz DDR4 memory module is built with hand-picked ICs and tuned timing parameters to achieve blistering performance on Gigabyte’s X99-SOC Champion extreme overclocking motherboard,” said Thi La, Chief Operating Officer at Corsair. “Achieving insanely fast memory clock speeds is just the beginning. We can’t wait to see the incredible high-performance machines that PC enthusiasts create with them.”

“Our Gigabyte X99-SOC Champion is engineered with highly optimized trace paths between the processor and DIMM sockets to enable incredible memory clock speeds,” said Colin Brix, Director of Marketing of Gigabyte’s Motherboard Business Unit. “We worked with Corsair to tune an exceptional edition of Dominator Platinum DDR4 that can help overclockers push the X99-SOC Champion to reach unprecedented memory speeds.”

World Record for Fastest DDR4 Memory Frequency
On March 20, professional overclocker Hicookie set the world record for fastest DDR4 memory frequency using the Corsair Dominator 3400MHz DDR4 memory and Gigabyte X99-SOC Champion motherboard. Using liquid nitrogen, Hicookie established a record-breaking speed of 4365.6MHz.

Source: Corsair

Psst, hey buddy I got some nice Android apps for you cheap! They just fell off the back of a truck ya know.

Subject: General Tech | March 25, 2015 - 12:25 PM |
Tagged: Android, security

If you are running a device with Android 4.3 or earlier you should avoid third-party app stores; arguably all users should, but there are times when Google Play does not offer what you need.  A security problem with the way that APK files are authenticated during install can allow a seemingly harmless app to be modified, either at the source or while being transmitted, leading to the installation of an app that may not be entirely honest about what it does.  Palo Alto Networks' testing shows that versions 4.4+ do not suffer from this particular problem, nor do the vetted apps at the Google Play store.  It is unlikely you will encounter this problem unless you usually install things from places like Creepy Ice Cream Van Discount Apps and Malware, but you should be aware of the existence of this issue.  More at The Inquirer.

index.jpg

"A FRESH VULNERABILITY CALLED Android Installer Hijacking is making itself known as a threat to almost half of all Android users."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Seasonic Snow Silent; if you prefer your PSU meek and pale

Subject: Cases and Cooling | March 23, 2015 - 02:30 PM |
Tagged: seasonic, snow series, modular psu, 1050W, 80 Plus Platinum

The ability to deliver stable power is the primary attribute we look for in a PSU, but the aesthetics should not be ignored, especially for modders and those with glass cases.  The Seasonic Snow Silent Series 1050W PSU is aptly named for its white finish, which gives it a unique look in a very crowded market, but the exterior does not tell the whole story.  With a total of eight 6+2 PCIe plugs and a 12V rail capable of providing 1044W @ 87 amps, this PSU will certainly handle CrossFire or SLI, and it will do so fairly quietly.  TechPowerUp liked the 7-year warranty as well as the performance and look of this PSU, especially considering it is priced roughly the same as the bog-standard grey model, which does not have the improved fan of this model.

PSU_top1a.jpg

"Seasonic released a special version of their Platinum offering with 1050 W capacity; it comes with a while paint job and is equipped with an FDB fan and a relaxed fan profile for less noise output, which will thrill users looking for a quiet high-performance PSU."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

 

Source: techPowerUp