ASUS Avalon concept PC merges desktops and DIY with cable-free mindset

Subject: Graphics Cards, Motherboards, Systems, Shows and Expos | May 30, 2016 - 08:04 AM |
Tagged: crazy people, concept, computex 2016, computex, avalon, asus

If you expected Computex to be bland and stale this year, ASUS has something that is going to change your mind. During the company's Republic of Gamers press conference, it revealed a concept PC design it has been working on, dubbed Avalon. The goal of this project was to improve on the fundamental design of the PC, something that hasn't changed in decades. ASUS wanted to show that you could build a platform that would allow DIY machines to be "more modular, easier to build, and more tightly integrated."

system-closed.jpg

The result is a proof of concept design that looks more like a high end turntable than a PC. In reality, you are looking at a machine that has been totally redesigned, from the power supply to motherboard and case integration to cooling considerations and more. ASUS has posted a great story that goes into a lot of detail on Avalon, and it's clear this is a project the team has been working on for some time.

The brainchild of Jonathan Chu, the Avalon concept takes a notebook-like approach to desktop design. The motherboard is designed in conjunction with the chassis to enable more seamless cooperation between the two.

system-open.jpg

The first example of Avalon's changes is something as simple as a case's front panel connectors. Wiring them to your motherboard today is basically the same as it has ever been. But if you are the manufacturer and designer of both the chassis and the motherboard, it is trivial to build the buttons, lights, and even additional capabilities into a specific location on the PCB that lines up with access points on the case.

io.jpg

Re-thinking the rear IO panel was another target: making it modular and connected to the system via PCI Express means you can swap connectivity options based on the user's needs. Multiple Gigabit NICs a requirement? Done. Maximum USB capability? Sure. Even better, by making the back panel IO a connected device, it can host storage and sound controllers on its own, allowing for improved audio solutions and flexible data configurations. 

psu.jpg

ASUS even worked in a prototype power supply that is based on the SFX form factor but uses a server-style edge connector, removing wires from the equation. It then becomes the motherboard's responsibility to distribute power to the other components, which again is easy to manage if you are designing both pieces in tandem. Installing or swapping a power supply becomes as simple as pulling out a drive tray.

This is all made possible by an internal structure that looks like this:

guts1.jpg

Rethinking how the motherboard is built and how it connects to the outside world and to other components meant that ASUS was able to adjust and change just about everything. The only area that remains unchanged is the discrete graphics card. These cards tend to draw too much power for any kind of edge connector (though the ASUS story linked above says they are working on a solution), so you see short-run cables going from a breakout on the motherboard to the standard ROG graphics card.

system-graphics.jpg

The ASUS EdgeUp story has some more images and details and I would encourage you to check it out if you find this topic compelling; I know I do. There are no prices, no release dates, no plans for sampling yet. ASUS has built a prototype that is "right on the edge of what’s possible" and they are looking for feedback from the community to see what direction they should go next.

Will the DIY PC in 2020 be a completely different thing than we build today? It seems ASUS is asking the same question.

Source: ASUS EdgeUp

Computex 2016: ASUS ROG Rampage V Edition 10: Extreme-performance gaming motherboard

Subject: Motherboards, Shows and Expos | May 30, 2016 - 07:18 AM |
Tagged: ROG, rampage v edition 10, computex 2016, computex, asus

In celebration of 10 years of ASUS ROG motherboards, the company today revealed the new Rampage V Edition 10, an X99 motherboard targeting the release of the Intel Broadwell-E processors that are also set to be announced this week at Computex. This new board has basically every feature and capability an ROG product and buyer could ask for, including more LEDs and LED control than I know what to do with.

rampagevedition10.jpg

Some more detail from the ASUS press release:

The Rampage V Edition 10 is a celebratory refresh of ROG’s flagship extreme-performance motherboard designed to let gamers and overclockers break every limit.

glow.jpg

Based on the Intel® X99 chipset, the new motherboard sets new industry standards. It features the ultimate RGB lighting scheme with five independently-controlled onboard LED areas plus one 4-pin 5050 RGB header, and all can be synchronized by the all-new Aura software for stunning aesthetics. ROG has also teamed up with well-known RGB strip-makers and case manufacturers, including CableMod, IN WIN, Deepcool, BitFenix, and Phanteks — helping simplify RGB lighting compatibility and control.

The new motherboard is equipped with multiple ASUS exclusive features to aid extreme overclockers. These include Extreme Engine Digi+ voltage-regulator module (VRM) for the cleanest, smoothest power, ASUS-exclusive T-topology technology for maxed-out DDR4 performance, and 5-Way Optimization for easy overclocking and fan tuning with one click.

shield.jpg

The Rampage V Edition 10 also includes multiple technologies to deliver the best gaming experience. The included SupremeFX Hi-Fi audio amplifier ensures flawless audio, dual Intel Gigabit Ethernet and GameFirst combine forces for low-latency networking, and ASUS Safe Slot reinforcement for PCIe connectors to prevent damage from heavy graphics cards. The new board introduces a patent-pending integrated I/O shield for style, easier construction, and enhanced durability.  There’s also a slew of onboard storage and connectivity options, including U.2, M.2, USB 3.1, and 3x3 Wi-Fi.

According to a post on an ASUS sub-site, the board will retail for $599 and should be on the market very soon!

MSI at Computex, a peek before the show

Subject: General Tech, Motherboards, Systems, Shows and Expos | May 25, 2016 - 02:12 PM |
Tagged: msi, computex 2016, GS63 Stealth Pro

MSI offered a sneak peek at the lineup you can expect to see them showcase at Computex and the list is quite long, with some interesting new additions.

gs63.PNG

For laptops, you can expect to see the new GS63 Stealth Pro, with a Core i7-6700HQ and a GTX 970M inside.  The cooling system is also new: a five-heatpipe design called Cooler Boost Trinity, with Whirlwind Blades pushing hot air out of the exhaust ports.  We should hear more about what this system actually is during the show.

GT83.PNG

The GT83 and GT73 Titan SLI laptops are built with VR in mind, and they also support output to multiple monitors and 4K resolutions, though perhaps not both at once.  The GT83 contains desktop-class GTX 980s, while the GT73 uses the mobile GTX 980M, or a single desktop GTX 980 if you prefer.

gs 73.PNG

The GS73 focuses on a slimmed-down design while still incorporating a GTX 970M and the aforementioned Cooler Boost Trinity system.  It will also sport a SteelSeries gaming keyboard, an ESS SABRE HiFi headset amp, and the Nahimic 2.0 sound system.

backpack.PNG

Something far more unusual is the 'Backpack PC', which lets you strap a Core i7 and a GTX 980 to your back so that you are not tied to a desk when using VR.  With that much hardware you will still need mains power, as a battery able to run the system for more than a few minutes would be prohibitively heavy.  On the other hand, the cables from your VR headset and controllers connect to the backpack, which should, in theory, keep them out of your way.

aegis.PNG

The Aegis Gaming Desktop is a far more familiar machine, though it too offers a nod towards VR by locating an HDMI connection at the front of the 19.6L case.  It will also have a Dragon Button, reminiscent of the old Turbo buttons on early PCs, which will boost your 'speed and performance' by 15%.  This is likely an overclocking preset, one that can presumably be enabled on the fly.

vortex.PNG

The Vortex G65 SLI desktop is a little less plain: a round case, a mere 6.5L in volume, that still contains two GTX 980s and an i7-6700K, cooled by MSI's proprietary Silent Storm Cooling system.  The Vortex continues MSI's pattern of building systems around VR compatibility.

cubi2.PNG

Continuing on to the Cubi 2 Plus, an SFF system powered by a Skylake-S class processor on a wee 5x5" mini-STX motherboard.  Because the CPU is not BGA it can be upgraded, and there is enough space in the system for a 2.5" SSD, albeit just barely.

x99a.PNG

On to their motherboards: first up is the X99A GAMING PRO CARBON, which offers a few new features to tempt users into upgrading.  Not only does it have USB Type-C connectors, but they are described as being located at the front, presumably on a header. It also sports Audio Boost 3, Turbo M.2 32 Gb/s, SEx ports, and Dynamic Mystic Light, an LED system with software that supports more than 16.8 million colors.

x99a titanium.PNG

For those more concerned with overclocking than having an impressive light show, the X99A XPOWER GAMING TITANIUM features Military Class 5 components and a specially designed thermal system to ensure a solid overclock.  It also has support for U.2 32Gb/s drives.

z170a.PNG

The last of the trio of motherboards will be the Z170A MPOWER GAMING TITANIUM, similar to the X99A model apart from the socket. You will get all the features of the TITANIUM series for your LGA1151 processors.

Expect to see much more information about these products and others once Computex gets underway.

Source: MSI

Manufacturer: NVIDIA

93% of a GP100 at least...

NVIDIA has announced the Tesla P100, the company's newest (and most powerful) accelerator for HPC. Based on the Pascal GP100 GPU, the Tesla P100 is built on 16nm FinFET and uses HBM2.

nvidia-2016-gtc-pascal-banner.png

NVIDIA provided a comparison table, to which we added what we know about a full GP100:

|                          | Tesla K40      | Tesla M40       | Tesla P100     | Full GP100     |
|--------------------------|----------------|-----------------|----------------|----------------|
| GPU                      | GK110 (Kepler) | GM200 (Maxwell) | GP100 (Pascal) | GP100 (Pascal) |
| SMs                      | 15             | 24              | 56             | 60             |
| TPCs                     | 15             | 24              | 28             | (30?)          |
| FP32 CUDA Cores / SM     | 192            | 128             | 64             | 64             |
| FP32 CUDA Cores / GPU    | 2880           | 3072            | 3584           | 3840           |
| FP64 CUDA Cores / SM     | 64             | 4               | 32             | 32             |
| FP64 CUDA Cores / GPU    | 960            | 96              | 1792           | 1920           |
| Base Clock               | 745 MHz        | 948 MHz         | 1328 MHz       | TBD            |
| GPU Boost Clock          | 810/875 MHz    | 1114 MHz        | 1480 MHz       | TBD            |
| FP64 GFLOPS              | 1680           | 213             | 5304           | TBD            |
| Texture Units            | 240            | 192             | 224            | 240            |
| Memory Interface         | 384-bit GDDR5  | 384-bit GDDR5   | 4096-bit HBM2  | 4096-bit HBM2  |
| Memory Size              | Up to 12 GB    | Up to 24 GB     | 16 GB          | TBD            |
| L2 Cache Size            | 1536 KB        | 3072 KB         | 4096 KB        | TBD            |
| Register File Size / SM  | 256 KB         | 256 KB          | 256 KB         | 256 KB         |
| Register File Size / GPU | 3840 KB        | 6144 KB         | 14336 KB       | 15360 KB       |
| TDP                      | 235 W          | 250 W           | 300 W          | TBD            |
| Transistors              | 7.1 billion    | 8 billion       | 15.3 billion   | 15.3 billion   |
| GPU Die Size             | 551 mm2        | 601 mm2         | 610 mm2        | 610 mm2        |
| Manufacturing Process    | 28 nm          | 28 nm           | 16 nm          | 16 nm          |

This table is designed for developers that are interested in GPU compute, so a few variables (like ROPs) are still unknown, but it still gives us a huge insight into the “big Pascal” architecture. The jump to 16nm allows for about twice the number of transistors, 15.3 billion, up from 8 billion with GM200, with roughly the same die area, 610 mm2, up from 601 mm2.

nvidia-2016-gp100_block_diagram-1-624x368.png

A full GP100 will have 60 streaming multiprocessors (SMs), compared to GM200's 24, although each Pascal SM contains half as many FP32 shaders (64 versus 128). The GP100 part listed in the table above is actually partially disabled, cutting off four of the sixty total. This leads to 3584 single-precision (32-bit) CUDA cores, which is up from 3072 in GM200. (The full GP100 will have 3840 of these FP32 CUDA cores -- but we don't know when or where we'll see that.) The base clock is also significantly higher than Maxwell's, 1328 MHz versus ~1000 MHz for the Titan X and 980 Ti, although Ryan has overclocked those GPUs to ~1390 MHz with relative ease. This is interesting, because even though 10.6 TeraFLOPS is amazing, it's only about 20% more than what GM200 could pull off with an overclock.
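For anyone curious where that 10.6 TFLOPS figure comes from, here is the back-of-the-envelope math: the usual 2 FLOPs per CUDA core per clock (one fused multiply-add), the P100's rated 1480 MHz boost clock, and the ~1390 MHz GM200 overclock mentioned above. This is just a quick sanity check, not an official NVIDIA calculation.

```python
# Rough FP32 throughput: 2 FLOPs (one fused multiply-add) per CUDA core per clock.
def fp32_tflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

p100  = fp32_tflops(3584, 1480)  # Tesla P100 at its rated boost clock
gm200 = fp32_tflops(3072, 1390)  # GM200 (Titan X / 980 Ti) at a ~1390 MHz overclock
print(f"P100 ~{p100:.1f} TFLOPS, overclocked GM200 ~{gm200:.1f} TFLOPS, "
      f"a {p100 / gm200 - 1:.0%} difference")
# -> P100 ~10.6 TFLOPS, overclocked GM200 ~8.5 TFLOPS, a 24% difference
```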

Continue reading our preview of the NVIDIA Pascal architecture!!

The Status of Windows Phone

Subject: Mobile, Shows and Expos | March 31, 2016 - 01:52 PM |
Tagged: BUILD, build 2016, microsoft, windows 10, windows phone

If you watched the opening keynote of Microsoft's Build conference, then you probably didn't see much Windows Phone (unless you were looking at your own). The Verge talked to Terry Myerson about this, and Microsoft confirmed that they are leading with non-Windows, 4-inch devices, and they want to “generate developer interest” on those platforms for this year.

PC World interpreted this conversation to mean that Windows Phone has been put on hold.

microsoft-2016-win10devices.jpg

That might be a little hasty, though. Microsoft is still building Windows 10 for Mobile. In fact, since Microsoft updated "Windows OneCore" and jumped to 14xxx-level build numbers with Windows 10 build 14251, Windows 10 Mobile and Windows 10 for PC have been kept in lockstep. As far as I know, that is still the plan, and Windows Insiders should continue to receive these builds on compatible devices.

That said, Microsoft has basically admitted that Windows Phone would just be a distraction for developers this year. At the very least, they don't believe that the platform will be ready for them until next year's Build conference, which means a consumer push is probably even further out than that, because there would be no applications waiting for them. Yes, Windows Phone could be slowly shimmying out of the spotlight, but it could also just be delayed until Microsoft can make a good impression, with the PC, Xbox, HoloLens, and other ecosystems secure enough to lift it up.

Source: The Verge

Microsoft's Phil Spencer Discusses UWP Concerns at Build

Subject: General Tech, Shows and Expos | March 30, 2016 - 01:14 PM |
Tagged: windows 10, uwp, microsoft, build 2016, BUILD

When a platform vendor puts up restrictions, it can be scary, and with good cause. Microsoft's Universal Windows Platform (UWP) is the successor to WinRT, which, in the Windows 8 era, forced web browsers to be reskins of Internet Explorer, forced developers to get both their software and themselves certified before publishing, and so forth. Microsoft still allowed the traditional, more open Win32 API, but locked it into "the Desktop App".

Naturally, UWP carries similar concerns, which some developers (like Tim Sweeney of Epic Games) voiced publicly. It's more permissive, but in a brittle way. We don't want Microsoft, or someone like a government who has authority over them, to flip a switch and prevent individuals from developing software, ban content that some stakeholder finds offensive (like art with LGBT characters in Russia, the Middle East, or even North America), or ban entire categories of software like encryption suites or third-party web browsers.

windows_8_logo-redux2.png

This is where we get to today's announcement.

Microsoft's Phil Spencer, essentially responding to Tim Sweeney's concerns and to the PC gaming community at large, announced changes to UWP to make it more open. I haven't had much time to think about it, and some necessary details don't translate well to a keynote segment, but we'll relay what we know. First, they plan to open up support for disabling VSync, as well as for FreeSync and G-Sync, in May. I find this kind of odd: since Windows 10 will not receive its next significant update (the "Anniversary Update") until July, I'm not sure how they would deliver this. It seems a little big for a simple Windows Update patch. I mean, they have yet to even push new versions of their Edge web browser outside of Windows 10 builds.

The second change is more interesting. Microsoft announced plans, albeit without committing to a solid release date or window, to allow modding and overlays in UWP applications. This means that software will be able to, somehow, enter into a UWP application's process, and users will be encouraged to, somehow, access the file system of UWP applications. Currently, you need to jump through severe hoops to access the contents of Windows Store applications.

They still did not address the issue of side-loading and developing software without a certificate. Granted, you can do both of those things in Windows 10, but in a way that seems like it could be easily removed in a future build, if UWP has enough momentum and whoever runs Microsoft at the time decides to. Remember, this would not be an insidious choice by malicious people. UWP is alluring to Microsoft because it could change the “Windows gets viruses” stigma that is associated with PCs. The problem is that it can be abused, or even unintentionally harm creators and potential users.

On the other hand, they are correcting some major issues. I'm just voicing concerns.

Source: Microsoft

Meet the new Intel Skulltrail NUC; Changing the Game

Subject: Shows and Expos | March 16, 2016 - 09:00 PM |
Tagged: skulltrail, Skull Canyon, nuc, Intel, GDC

board.jpg

No, we are not talking about the motherboard from 2008 that was going to compete with AMD's QuadFX platform and worked out just about as well.  We are talking about a brand new Skull Canyon NUC powered by an i7-6770HQ with Iris Pro 580 graphics and up to 32GB of DDR4-2133.  The NUC6i7KYK will also be the first system we have seen with a fully capable USB Type-C port: it offers Thunderbolt 3, USB 3.1, and DisplayPort 1.2 connectivity, not simultaneously, but the flexibility is nothing less than impressive.  It will also sport a full-size HDMI 2.0 port and a Mini DisplayPort 1.2 output, so you can still send video while using the Type-C port for data transfer.  The port will also support external graphics card enclosures if you plan on using this as a gaming machine.

SkullCanyon-NUC-extremeangle1-wht.jpg

The internal storage subsystem is equally impressive: dual M.2 slots will give you great performance, and while the SD card slot won't, it is still a handy feature.  Connectivity is supplied by Intel Dual Band Wireless-AC 8260 (802.11ac) and Bluetooth 4.2, and an infrared sensor will let you use your favourite remote control if you set up the Skull Canyon NUC as a media server.  All of these features fit in a device less than 0.7 litres in size, with your choice of two covers and support for your own design if you want to personalize the system.  The price is not unreasonable: the MSRP for a barebones system is $650, and one with 16GB of memory, a 256GB SSD, and Windows 10 should retail for about $1,000.  Expect to see them listed on Newegg in April and shipping in May.

All this and more can be found on Intel's news room, and you can click here for the full system specs.

Source: Intel

Shedding a little light on Monday's announcement

Most of our readers should have some familiarity with GameWorks, which is a series of libraries and utilities that help game developers (and others) create software. While many hardware and platform vendors provide samples and frameworks, taking the brunt of the work required to solve complex problems, this is NVIDIA's branding for their suite of technologies. Their hope is that it pushes the industry forward, which in turn drives GPU sales as users see the benefits of upgrading.

nvidia-2016-gdc-gameworksmission.png

This release, GameWorks SDK 3.1, contains three complete features and two "beta" ones. We will start with the first three, each of which targets a portion of the lighting and shadowing problem. The last two, which we will discuss at the end, are the experimental ones and fall under the blanket of physics and visual effects.

nvidia-2016-gdc-volumetriclighting-fallout.png

The first technology is Volumetric Lighting, which simulates the way light scatters off dust in the atmosphere. Game developers have been approximating this effect for a long time. In fact, I remember a particular section of Resident Evil 4 where you walk down a dim hallway that has light rays spilling in from the windows. GameCube-era graphics could only do so much, though, and certain camera positions show that the effect was just a translucent, one-sided, decorative plane. It was a cheat that was hand-placed by a clever artist.

nvidia-2016-gdc-volumetriclighting-shaftswireframe.png

GameWorks' Volumetric Lighting goes after the same effect, but with a much different implementation. It looks at the generated shadow maps and, using hardware tessellation, extrudes geometry from the unshadowed portions toward the light. These bits of geometry are accumulated additively, so the deeper the lit volume a pixel looks through, the brighter the resulting highlight. Also, since it's hardware tessellated, it probably has a smaller impact on performance because the GPU only needs to store enough information to generate the geometry, not store (and update) the geometry data for all possible light shafts themselves -- and it needs to store those shadow maps anyway.
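NVIDIA's tessellation-based extrusion isn't something we can reproduce in a few lines, but the quantity it accumulates (how much lit distance each view ray passes through, weighted by a scattering falloff) can be sketched with a brute-force ray march. The snippet below is purely illustrative: the scene setup and toy occlusion test are ours, and a real implementation would sample the actual shadow map on the GPU.

```python
# Illustrative ray march: accumulate in-scattering over the unshadowed
# stretches of a view ray. NVIDIA's technique arrives at the same quantity
# by additively rendering tessellated volumes extruded from the shadow map.
def in_shadow(x, y):
    """Toy occlusion test: everything beyond x = 5 sits behind a wall.
    A real renderer would sample the light's shadow map here."""
    return x > 5.0

def scattered_light(cam, ray_dir, light, max_dist, steps=64, scatter=0.02):
    """March along the view ray; every unshadowed step adds in-scattering,
    so deeper lit volumes produce brighter shafts."""
    step_len = max_dist / steps
    radiance = 0.0
    for i in range(steps):
        t = (i + 0.5) * step_len
        px, py = cam[0] + ray_dir[0] * t, cam[1] + ray_dir[1] * t
        if not in_shadow(px, py):
            d2 = (px - light[0]) ** 2 + (py - light[1]) ** 2
            radiance += scatter * step_len / max(d2, 1.0)  # simple falloff
    return radiance

# A ray that spends half of its length in open, lit air
print(scattered_light(cam=(0.0, 0.0), ray_dir=(1.0, 0.0),
                      light=(3.0, 4.0), max_dist=10.0))
```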

nvidia-2016-gdc-volumetriclighting-shaftsfinal.png

Even though it seemed like this effect was independent of render method, since it basically just adds geometry to the scene, I asked whether it was locked to deferred rendering methods. NVIDIA said that it should be unrelated, as I suspected, which is good for VR. Forward rendering is easier to anti-alias, which makes the uneven pixel distribution (after lens distortion) appear smoother.

Read on to see the other four technologies, and a little announcement about source access.

MWC 16: Imagination Technologies Ray Tracing Accelerator

Subject: Graphics Cards, Mobile, Shows and Expos | February 23, 2016 - 08:46 PM |
Tagged: raytracing, ray tracing, PowerVR, mwc 16, MWC, Imagination Technologies

For the last couple of years, Imagination Technologies has been pushing hardware-accelerated ray tracing. One of the major problems in computer graphics is knowing what geometry and material corresponds to a specific pixel on the screen. Several methods exist, although typical GPUs crush a 3D scene into the virtual camera's 2D space and do a point-in-triangle test on it. Once they know where in the triangle the pixel lies, if it is in the triangle at all, it can be colored by a pixel shader.
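As a rough illustration of that coverage test (not any vendor's actual rasterizer, just the textbook edge-function form of it), here is what the per-pixel check boils down to:

```python
# Signed edge functions tell you which side of each triangle edge a pixel
# center lies on; real GPUs evaluate these in fixed-point, many pixels at once.
def edge(ax, ay, bx, by, px, py):
    """Twice the signed area of triangle A-B-P; the sign says which side of AB point P is on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def pixel_in_triangle(px, py, tri):
    (ax, ay), (bx, by), (cx, cy) = tri
    w0 = edge(bx, by, cx, cy, px, py)
    w1 = edge(cx, cy, ax, ay, px, py)
    w2 = edge(ax, ay, bx, by, px, py)
    # Inside if the pixel is on the same side of all three edges
    # (either winding order is accepted here).
    return (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0)

tri = ((1.0, 1.0), (8.0, 2.0), (3.0, 7.0))
print(pixel_in_triangle(4.0, 3.0, tri))   # True:  shade this pixel
print(pixel_in_triangle(0.0, 0.0, tri))   # False: skip it
```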

imagtech-2016-PowerVR-GR6500-GPU-PowerVR-Wizard-GPUs.png

Another method is casting light rays into the scene and assigning a color based on the material that they land on. This is ray tracing, and it has a few advantages. First, it is much easier to handle reflections, transparency, shadows, and other effects that require information beyond what the affected geometry and its material provide. There are usually ways around this without resorting to ray tracing, but they each have their own trade-offs. Second, it can be more efficient for certain data sets. Rasterization, since it's based around a "where in a triangle is this point" algorithm, needs geometry to be made up of polygons.
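The equivalent building block for a ray tracer is a ray/triangle intersection test. The sketch below is the standard Moller-Trumbore formulation in plain Python, offered only to show the shape of the work Imagination's hardware accelerates; a real ray tracer would batch these tests and cull most of them with a spatial acceleration structure such as a BVH.

```python
# Moller-Trumbore ray/triangle intersection, written out in plain Python.
def sub(a, b):   return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the distance t along the ray to the hit point, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:                 # ray is parallel to the triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(tvec, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None      # only count hits in front of the origin

# A ray shot down the z-axis toward a triangle sitting at z = 5
print(ray_hits_triangle((0, 0, 0), (0, 0, 1),
                        (-1, -1, 5), (2, -1, 5), (0, 2, 5)))   # ~5.0
```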

It also has the appeal of being what the real world sort-of does (assuming we don't need to model Gaussian beams). That doesn't necessarily mean anything, though.

At Mobile World Congress, Imagination Technologies once again showed off their ray tracing hardware, embodied in the PowerVR GR6500 GPU. This graphics processor has dedicated circuitry to calculate rays, and they use it in a couple of different ways. They presented several demos that modified Unity 5 to take advantage of their ray tracing hardware. One particularly interesting demo was a quick, seven-second video that added ray traced reflections atop an otherwise rasterized scene. It looked a little too smooth, creating reflections that were too glossy, but that could probably be downplayed in the material (Update, Feb 24th @ 5pm: Car paint is actually that glossy; it's a different issue). Back when I was working on a GPU-accelerated software renderer, before Mantle, Vulkan, and DirectX 12, I was hoping to use OpenCL-based ray traced highlights on idle GPUs, if I didn't have any other purpose for them. Now, though, those GPUs can be exposed to graphics APIs directly, so they might not be so idle.

The downside of dedicated ray tracing hardware is that, well, the die area could have been used for something else. Extra shaders, for compute, vertex, and material effects, might be more useful in the real world... or maybe not. Add in the fact that fixed-function circuitry already exists for rasterization, and it makes you balance gain for cost.

It could be cool, but it has its trade-offs, like anything else.

MWC 16: LG G5 Hands-on. Performance and Modularity

Subject: Mobile, Shows and Expos | February 22, 2016 - 05:09 AM |
Tagged: video, snapdragon 820, snapdragon, qualcomm, MWC 2016, MWC, LG, G5

The new LG G5 flagship smartphone offers a unique combination of form factor, performance and modularity that no previous smartphone design has had. But will you want to buy in?

2016-02-22 09.15.37.jpg

I had a feeling that the Snapdragon 820 SoC from Qualcomm would make an impression at Mobile World Congress this year, and it appears the company has improved on its previous flagship processor quite a bit. Both Samsung and LG have implemented it in their 2016 models, including the new G5, offering a combination of performance and power efficiency that is dramatically better than the Snapdragon 810, which was hindered by heat and process technology concerns.

Along with the new processor, the G5 includes 4GB of RAM, 32GB of on-board storage with microSD expansion, a 2,800 mAh battery, and Android 6.0 out of the box. The display is a 5.3-inch LG IPS panel with a 2560x1440 resolution, resulting in an impressive 554 PPI. LG has updated the USB connection to Type-C, a move that Samsung brushed off as unnecessary at this time.
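That pixel density figure checks out if you divide the panel's diagonal pixel count by its 5.3-inch diagonal:

```python
import math
ppi = math.hypot(2560, 1440) / 5.3   # diagonal pixels / diagonal inches
print(round(ppi))                    # 554
```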

The phone's design is pretty standard and will look very familiar to anyone that has handled a G4 or similar flagship smartphone in recent months. It felt bigger in the hand than the iPhone 6s, but considering the difference in panel size, it was more compact than expected.

2016-02-22 09.15.40.jpg

Modularity is the truly unique addition to the G5 though. The battery is replaceable by sliding out a bottom portion of the phone, released with a tab on the left side. This allows LG to maintain the metal body construction but still offer flexibility for power users that are used to having extra batteries in their bag. This mechanism also means LG can offer add-on modules for the phone.

2016-02-22 09.05.04.jpg

The first two modules available will be the LG Cam Plus and the LG Hi-Fi Plus. The Cam Plus gives the phone a camera grip as well as dedicated buttons for the shutter, video recording, and zoom; the extra 1,200 mAh of battery it includes is a nice touch too. The Hi-Fi Plus module has a DAC and headphone amplifier embedded in it and can also be connected to a PC through the USB Type-C connection.

2016-02-22 09.13.48.jpg

Overall, I was pretty impressed with what LG had to offer with the G5. Whether or not the modular design gains any traction remains to be seen; I have concerns about the public's desire to carry around modules or change the form factor of their phones so dramatically.