Subject: Graphics Cards | March 2, 2015 - 02:31 PM | Ryan Shrout
Tagged: sdk, Mantle, dx12, API, amd
The Game Developers Conference in San Francisco starts today, and you can expect to see more information about DirectX 12 than you could ever possibly want, so be prepared. But what about the original low-level API, AMD Mantle? Announced alongside the release of the Radeon R9 290X/290, utilized in Battlefield 4 and Thief, and integrated into Crytek's engine (announced last year), Mantle was truly the instigator that pushed Microsoft into moving DX12's development along at a faster pace.
Since DX12's announcement, AMD has claimed that Mantle would live on, bringing performance advantages to AMD GPUs and acting as a sounding board for new API features for AMD and its game development partners. And, as was trumpeted from the very beginning, Mantle would become an open API, available to all once it outgrew the beta phase in which it (still) resides.
Something might have changed there.
A post over on the AMD Gaming blog from Robert Hallock has some news about Mantle to share as GDC begins. First, the good news:
AMD is a company that fundamentally believes in technologies unfettered by restrictive contracts, licensing fees, vendor lock-ins or other arbitrary hurdles to solving the big challenges in graphics and computing. Mantle was destined to follow suit, and it does so today as we proudly announce that the 450-page programming guide and API reference for Mantle will be available this month (March, 2015) at www.amd.com/mantle.
This documentation will provide developers with a detailed look at the capabilities we’ve implemented and the design decisions we made, and we hope it will stimulate more discussion that leads to even better graphics API standards in the months and years ahead.
That's great! We will finally be able to read about the API and how it functions, getting access to the detailed information we have wanted from the beginning. But then there is this portion:
AMD’s game development partners have similarly started to shift their focus, so it follows that 2015 will be a transitional year for Mantle. Our loyal customers are naturally curious what this transition might entail, and we wanted to share some thoughts with you on where we will be taking Mantle next:
- AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.
- Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
- Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
- The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.
Essentially, AMD's Mantle API in its "1.0" form is at the end of its life: it will only be supported for current partners, and a publicly available SDK will never be posted. Honestly, at this point, this isn't so much a letdown as a necessity. DX12 and GLnext have already superseded Mantle in market share and mind share with developers, and any more work AMD put into getting devs on board with Mantle would be wasted effort.
Battlefield 4 is likely to be the only major title to use AMD Mantle
AMD claims to have future plans for Mantle, though it will continue to be available only to select partners with "custom needs." I would imagine this expands outside the world of games, but it could also mean game consoles are the target, where developers are only concerned with AMD GPU hardware.
So, from our perspective, Mantle as we know it is pretty much gone. It served its purpose, making NVIDIA and Microsoft pay attention to the CPU bottlenecks in DX11, but it appears the dream was a bit bigger than the product could become. AMD shouldn't be chastised for this shift, nor for its lofty goals that we kind-of-always knew were too steep a hill to climb. Just revel in the news that pours from GDC this week about DX12.
Project Lead: Joris-Jan van ‘t Land
Thanks to Ian Comings, guest writer from the PC Perspective Forums who conducted the interview of Bohemia Interactive's Joris-Jan van ‘t Land. If you are interested in learning more about ArmA 3 and hanging out with some PC gamers to play it, check out the PC Perspective Gaming Forum!
I recently got the chance to send some questions to Bohemia Interactive, a computer game development company based out of Prague, Czech Republic, and a member of IDEA Games. Bohemia Interactive was founded in 1999 by CEO Marek Španěl, and it is best known for PC gaming gems like Operation Flashpoint: Cold War Crisis, the ArmA series, Take On Helicopters, and DayZ. The questions are answered by ArmA 3's Project Lead, Joris-Jan van ‘t Land.
PC Perspective: How long have you been at Bohemia Interactive?
VAN ‘T LAND: All in all, about 14 years now.
PC Perspective: What inspired you to become a Project Lead at Bohemia Interactive?
VAN ‘T LAND: During high school, it was pretty clear to me that I wanted to work in game development, and just before graduation, a friend and I saw a first preview for Operation Flashpoint: Cold War Crisis in a magazine. It immediately looked amazing to us; we were drawn to the freedom and diversity it promised and the military theme. After helping run a fan website (Operation Flashpoint Network) for a while, I started to assist with part-time external design work on the game (scripting and scenario editing). From that point, I basically grew naturally into this role at Bohemia Interactive.
PC Perspective: What part of working at Bohemia Interactive do you find most satisfying? What do you find most challenging?
VAN ‘T LAND: The amount of freedom and autonomy is very satisfying. If you can demonstrate skills in some area, you're welcome to come up with random ideas and roll with them. Some of those ideas can result in official releases, such as Arma 3 Zeus. Another rewarding aspect is the near real-time connection to those people who are playing the game. Our daily Dev-Branch release means the work I do on Monday is live on Tuesday. Our own ambitions, on the other hand, can sometimes result in some challenges. We want to do a lot and incorporate every aspect of combat in Arma, but we're still a relatively small team. This can mean we bite off more than we can deliver at an acceptable level of quality.
PC Perspective: What are some of the problems that have plagued your team, and how have they been overcome?
VAN ‘T LAND: One key problem for us was that we had no real experience with developing a game in more than one physical location. For Arma 3, our team was split over two main offices, which caused quite a few headaches in terms of communication and data synchronization. We've since had more key team members travel between the offices more frequently and improved our various virtual communication methods. A lot of work has been done to try to ensure that both offices have the latest version of the game at any given time. That is not always easy when your bandwidth is limited and games are getting bigger and bigger.
Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM | Ryan Shrout
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900
As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off of a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in the Series7XT family, targeting a growing category called "affordable game consoles." Think of Android-powered set-top devices like the Ouya or Amazon's Fire TV.
PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how its designs compare to other GPUs on the market. Imagination claims that the GT7900 will offer "PC-class gaming experiences," though that is as ambiguous as the workload of a "console-level game." But with rated peak performance hitting over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half precision), this GPU does have significant theoretical capability.
|              | PowerVR GT7900 | Tegra X1  |
| GPU Clock    | 800 MHz        | 1000 MHz  |
| Process Tech | 16nm FinFET+   | 20nm TSMC |
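Those rated peak figures are easy to sanity-check. A quick back-of-the-envelope calculation, assuming each of the 512 ALU cores retires one FMA (two FLOPs) per clock and that FP16 operations issue at twice the FP32 rate, lands right on the quoted numbers:

```python
# Peak-throughput sanity check for the GT7900's quoted figures, assuming
# one FMA (2 FLOPs) per ALU core per clock at the 800 MHz listed above.
alu_cores = 16 * 32          # 16 USCs at 32 ALUs each = 512 cores
clock_ghz = 0.8              # 800 MHz
fp32_gflops = alu_cores * 2 * clock_ghz
fp16_gflops = fp32_gflops * 2   # assumed: FP16 runs at twice the FP32 rate
print(fp32_gflops)  # 819.2, i.e. "over 800 GFLOPS"
print(fp16_gflops)  # 1638.4, i.e. "1.6 TFLOPS"
```

The per-USC ALU count and FMA assumption are inferences from the totals Imagination quotes, not official microarchitecture details.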
Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."
The FP16 performance number listed above is useful as an extreme power-saving option, as half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually utilize the FP16 data type, but having support for it in the GT7900 allows developers to target it.
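Nothing GPU-specific is needed to see the storage half of that trade-off; Python's standard `struct` module can pack IEEE 754 half-floats, and a minimal sketch shows both the halved footprint and the precision cost:

```python
import struct

# Pack the same value as FP32 and FP16: half precision literally halves
# the storage (and thus the memory bandwidth) per value.
as_fp32 = struct.pack('f', 3.14159)   # 4 bytes
as_fp16 = struct.pack('e', 3.14159)   # 2 bytes
print(len(as_fp32), len(as_fp16))     # 4 2

# The cost is precision: a round-trip through FP16 loses low-order digits.
roundtrip = struct.unpack('e', as_fp16)[0]
print(roundtrip)  # 3.140625, only about 3 decimal digits survive
```

This is just an illustration of the FP16 format itself, not of how the GT7900 schedules half-precision work.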
Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.
Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.
I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experiences to report back with soon. I am continually curious about the market for these types of high-end "mobile" GPUs, given the limited audience that Android consoles currently address. Imagination does claim that the GT7900 beats products as fast as the GeForce GT 730M discrete GPU - no small feat.
Subject: General Tech | February 26, 2015 - 07:00 AM | Scott Michaud
Tagged: windows 10, windows, microsoft
WZor, a group in Russia that somehow acquires many Windows leaks, has just published screenshots of Windows 10 Build 10022 and Windows Server Build 9926. As far as we can tell, not much has changed. We see neither an upgraded Cortana nor a look at the Spartan browser. The build is not labeled “Microsoft Confidential” though, which makes people believe that it is (or was) intended for public release -- maybe as early as this week.
Image Credit: WZor Twitter
Honestly, I do not see anything different in the provided screenshots apart from the incremented version number. It is possible that this build addresses back-end issues, leaving the major new features for BUILD in late April. Leaked notes (also by WZor) for build 10014, called an “Early Partner Drop”, suggest that version was designed for hardware and software vendors. Perhaps the upcoming preview build is designed to provide a platform for third parties to develop updates ahead of Microsoft releasing the next (or second-next) big build?
Either way, it seems like we will get it very soon.
Subject: General Tech | February 26, 2015 - 02:07 PM | Ken Addison
Tagged: pcper, podcast, video, usb 3.1, Broadwell, Intel, nuc, Samsung, 840 evo, asus, Strix Tactic Pro, GTX 970, directx12, dx12
PC Perspective Podcast #338 - 02/26/2015
Join us this week as we discuss more USB 3.1 Devices, Broadwell NUC, another 840 Evo fix and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Allyn Malventano, and Sebastian Peak
Program length: 1:46:04
EVGA Contest Winner!
Week in Review:
News item of interest:
Question: Alex from Sydney
Just a quick question regarding DirectX 12. I’m planning to buy a new graphics card soon, and I want a DirectX 12 card for all the fancy new features, so I’m considering either the GTX 970 or 980. The question I have is: are these real DirectX 12 cards? Since DirectX 12 development is still ongoing, how can these cards be fully DirectX 12 compliant?
Hardware/Software Picks of the Week:
Subject: General Tech | February 26, 2015 - 02:02 AM | Tim Verry
Tagged: SoFIA, moorefield, Intel, Cherry Trail, branding, atom
Intel is updating its Atom processor branding to better communicate the performance and experience customers can expect from their Intel-powered mobile devices. In fact, the new branding specifies three tiers: Atom processors will soon come in Atom x3, x5, and x7 flavors. This branding scheme is similar to the Core processor branding with its i3, i5, and i7 labels.
The Atom x3, x5, and x7 chips are low power, efficient processors for battery powered devices and sit below the Core M series which in turn are below the Core i3, i5, and i7 processors. The following infographic shows off the new branding though Intel does not reveal any specific details about these new Atom chips (we will hopefully know more after Mobile World Congress). Of course, Atom x3 chips will reside in smartphones with x5 and x7 chips powering tablets and budget convertibles. The x7 brand represents the flagship processors of the Atom line.
The new branding will begin with the next generation of Atom chips, which should include Cherry Trail, the 14nm successor to Bay Trail featuring four x86 Airmont cores and Gen 8 Intel graphics. Cherry Trail (the Cherryview SoC) will be used in all manner of mobile devices, from entry-level 8"+ tablets to larger notebooks and convertibles. It appears that Intel will use Moorefield (a quad-core 14nm refresh of Merrifield) through 2015 for smartphones, though road maps seem to indicate that Intel's budget SoFIA SoC will also launch this year. SoFIA and Moorefield processors should fall under the Atom x3 brand, while the higher-powered and higher-clocked Cherry Trail chips will use the Atom x5 and x7 monikers.
What are your thoughts on Intel's new Atom x3/x5/x7 brands?
Subject: Graphics Cards | March 1, 2015 - 07:30 AM | Scott Michaud
Tagged: superfish, Lenovo, bloatware, adware
Obviously, this does not erase the controversy that Lenovo got themselves into, but it is certainly the correct response (if they act how they imply). Adware and bloatware are common on consumer PCs, making the slowest of devices even more sluggish as demos and sometimes straight-up advertisements claim their share of your resources. That does not even begin to address the security issues that some of these hitchhikers drag in. Again, I refer you to the aforementioned controversy.
In response, albeit a delayed one, Lenovo has announced that, by the launch of Windows 10, they will only pre-install the OS and “related software”. Lenovo classifies this related software as drivers, security software, Lenovo applications, and applications for “unique hardware” (ex: software for an embedded 3D camera).
It looks to be a great step, but I need to call out “security software”. Windows 10 should ship with Microsoft's security applications in many regions, which raises the question of why a laptop provider would include an alternative. If the problem is that people expect McAfee or Symantec, then advertise pre-loaded Microsoft anti-malware and keep it clean. Otherwise, it feels like keeping a single finger in the adware take-a-penny dish.
At least it is not as bad as trying to install McAfee every time you update Flash Player. I consider Adobe's tactic the greater of two evils on that one. I mean, unless Adobe just thinks that Flash Player is so insecure that you would be crazy to install it without a metaphorical guard watching over your shoulder.
And then, of course, we reach the divide between “saying” and “doing”. We will need to see Lenovo's actual Windows 10 devices to find out whether they kept their word and followed its implications to a tee.
Subject: General Tech | February 25, 2015 - 12:36 PM | Jeremy Hellstrom
Tagged: fud, Comodo, SSL, security, PrivDog, idiots
This has been a bad week for the Secure Sockets Layer, and the news just keeps getting worse. Comodo provides around one out of every three SSL certs currently in use, as until now it had a sterling reputation and was a trusted provider. It turns out that this reputation may not be deserved, seeing as its Internet Security 2014 product ships with an application called Adtrustmedia PrivDog, which is enabled by default. Not only does this app install a custom root CA certificate that intercepts connections to websites in order to insert customized ads, as SuperFish does, it can also turn invalid HTTPS certificates into valid ones. That means an attacker can use PrivDog to spoof your bank's SSL cert, redirect you to a fake page, and grab your credentials, all while your browser reports a valid and secure connection to the site.
The only good news from The Register's article is that this specific vulnerability is only present in PrivDog versions 3.0.96.0 and 3.0.97.0 and so has limited distribution. The depressing part is what this indicates: the entire SSL certificate model is broken, and even those who create the certs to assure your security feel that inserting a man-in-the-middle attack into their software does not contravene their entire reason for existing.
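To put the severity in perspective, here is a minimal Python sketch of the validation a TLS client normally performs, versus the state that certificate-rewriting interception effectively leaves you in (`ssl._create_unverified_context` is a private stdlib helper, used here purely to illustrate the "accept anything" end of the spectrum):

```python
import ssl

# The safeguard PrivDog defeats: a default TLS client context requires a
# valid certificate chain and a matching hostname before trusting a server.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: invalid certs are rejected
print(ctx.check_hostname)                    # True: cert must match the site

# Interception that turns invalid certs into "valid" ones leaves the client
# effectively in this state, no matter what the browser UI claims:
mitm_like = ssl._create_unverified_context()
print(mitm_like.verify_mode == ssl.CERT_NONE)  # True: any cert is accepted
```

The sketch only demonstrates client-side verification settings; PrivDog's actual mechanism is a local proxy plus an injected root CA certificate.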
"The US Department of Homeland Security's cyber-cops have slapped down PrivDog, an SSL tampering tool backed by, er, SSL certificate flogger Comodo.
Comodo, a global SSL authority, boasts a third of the HTTPS cert market, and is already in hot water for shipping PrivDog."
Here is some more Tech News from around the web:
- AMD previews Carrizo APU, offers insights into power savings @ The Tech Report
- Amazon tries to patent 3D printers on trucks @ The Register
- Mozilla Firefox 36 is second major browser to bring HTTP/2 @ The Inquirer
- Samb-AAAHH! Scary remote execution vuln spotted in Windows-Linux interop code @ The Register
- JEDEC publishes eMMC 5.1 standard @ DigiTimes
- Red Hat: Traditional virtualisation isn't going anywhere @ The Inquirer
Subject: Mobile | March 1, 2015 - 02:01 PM | Sebastian Peak
Tagged: SoC, smartphones, Samsung, MWC 2015, MWC, Galaxy S6 Edge, galaxy s6, Exynos 7420, 14nm
Samsung has announced the new Galaxy S phones at MWC, and the new S6 and S6 Edge should be in line with what you were expecting if you’ve followed recent rumors.
The new Samsung Galaxy S6 and S6 Edge (Image credit: Android Central)
As expected, we no longer see a Qualcomm SoC powering the new phones; as the rumors had indicated, Samsung opted instead for its own Exynos 7 Octa mobile AP. Exynos SoCs have previously appeared in international versions of Samsung’s mobile devices, but the company has apparently ramped up production to meet the demands of the US market as well. There is an interesting twist here, however.
The Exynos 7420 powering both the Galaxy S6 and S6 Edge is an 8-core SoC with ARM’s big.LITTLE design, combining four ARM Cortex-A57 cores and four Cortex-A53 cores. With Samsung having announced 14nm FinFET mobile AP production earlier in February, the possibility of the S6 launching with this new part was intriguing, as the current process tech for the Exynos 7 is 20nm HKMG. A switch to the new process so soon before the official announcement seemed unlikely, however, as large-scale 14nm FinFET production was only unveiled on February 16. Regardless, AnandTech is reporting that the new part will indeed be produced on the 14nm process, giving Samsung an industry first for a mobile SoC with the launch of the S6/S6 Edge.
GSM Arena has specs of the Galaxy S6 posted, and here’s a brief overview:
- Display: 5.1” Super AMOLED, QHD resolution (1440 x 2560, ~577 ppi), Gorilla Glass 4
- OS: Android OS, v5.0 (Lollipop) - TouchWiz UI
- Chipset: Exynos 7420
- CPU: Quad-core 1.5 GHz Cortex-A53 & Quad-core 2.1 GHz Cortex-A57
- GPU: Mali-T760
- Storage/RAM: 32/64/128 GB, 3 GB RAM
- Camera: (Primary) 16 MP, 3456 x 4608, optical image stabilization, autofocus, LED flash
- Battery: 2550 mAh (non-removable)
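The pixel density quoted in the specs above is easy to verify from the resolution and the 5.1-inch diagonal:

```python
import math

# ppi = diagonal length in pixels / diagonal length in inches
width_px, height_px, diagonal_in = 1440, 2560, 5.1
ppi = math.hypot(width_px, height_px) / diagonal_in
print(round(ppi))  # 576, in line with the quoted ~577 ppi
```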
The new phones both feature attractive styling with metal and glass construction and Gorilla Glass 4 sandwiching the frame, giving each phone a glass back.
The back of the new Galaxy S6 (Image credit: Android Central)
The guys at Android Central (source) had some pre-release time with the phones and have a full preview and hands-on video up on their site. The new phones will be released worldwide on April 10, and no specifics on pricing have been announced.
Introduction and First Impressions
The RV05 is the current iteration of SilverStone's Raven enclosure series, and a reinvention of their ATX enthusiast design with a revised layout that eliminates 5.25" drive bays for a smaller footprint.
Return to Form
The fifth edition of SilverStone's Raven is a return to form of sorts, as it owes more to the design of the original RV01 than to the three that followed. The exterior again has an aggressive, angular look, with the entire enclosure sitting up slightly at the rear and tilted forward. Though the overall effect is likely less visually exciting than the original, depending on taste, in its simplicity the design feels more refined and modern than the RV01. Some of the sharpest angles have been eliminated or softened, though the squat stance coupled with its smaller size gives the RV05 an energetic appearance - as if it's ready to strike. (OK, I know it's just a computer case, but still...)
The Raven series is important to the case market as a pioneer of the 90º motherboard layout for ATX systems, expanding on the design originally developed by Intel for the short-lived BTX form-factor. In the layout implemented in the Raven series the motherboard is installed with the back IO panel facing up, which requires the graphics card to be installed vertically. This vertical orientation assists with heat removal by exploiting the tendency of warm air to rise, and when implemented in an enclosure like the RV05 it can create an excellent thermal environment for your components. The RV05 features large fans at the bottom of the case that push air upward and across the components on the motherboard, forcing warm air to exit through a well-ventilated top panel.
And the RV05 isn't just a working example of an interesting thermal profile; it's actually a really cool-looking enclosure with some premium features and a surprisingly low price for a product like this: $129 on Amazon as this was written. In our review of the RV05 we'll take a close look at the case and build process, and of course we'll test the thermal performance with some CPU and GPU workloads to find out just how well this design performs.
Introduction and Technical Specifications
Courtesy of Cooler Master
Cooler Master is known in the enthusiast community for their innovative designs with product offerings ranging from cases to desktop and laptop cooling implements. Cooler Master also offers their own line of all-in-one (AIO) CPU liquid cooling solutions for better system performance without the noise of a typical air cooler. With their Nepton 240M cooler, they enhanced the existing design of their previous AIO products, optimizing its performance with an enhanced pump and radiator design. We measured the unit's performance against that of other high-performance liquid and air coolers to best illustrate its abilities. The Nepton 240M's premium performance comes with a premium price, at a $139.99 MSRP.
Courtesy of Cooler Master
Courtesy of Cooler Master
The Nepton 240M AIO liquid cooler features a 240mm aluminum-finned radiator tied to a base unit consisting of a 120 liter-per-hour pump and a micro-finned copper base plate. Unlike the Glacer model, the Nepton 240M does not offer the ability to drain and refill the unit. Cooler Master designed the Nepton 240M with a 27mm-deep, 2x120mm copper radiator with brass internal channels, bundled with two of its 120mm Silencio model fans. The Silencio fans are optimized for low noise and high pressure, perfect for use with a liquid cooling radiator. The radiator and unit base are connected by ribbed FEP (Fluorinated Ethylene Propylene) tubing, allowing for high flexibility without the worry of tube kinking.
Subject: Mobile | February 25, 2015 - 04:46 PM | Jeremy Hellstrom
Tagged: z3580, venue 8 7000, venue, tablet, silvermont, moorefield, Intel, dell, atom z3580, Android
Dell's Venue 8 7000 tablet sports an 8.4" 2560x1600 OLED display and is powered by the Moorefield-based Atom Z3580 SoC, with 2GB of LPDDR3-1600, 16GB of internal storage, and support for up to a 512GB microSD card. Even more impressive is that The Tech Report had no issues installing apps or moving files to the SD card with ES File Explorer, unlike many Android devices that need certain programs to reside on the internal storage media. Like Ryan, they had a lot of fun with the RealSense camera and are looking forward to the upgrade to Lollipop. Check out The Tech Report's opinion of this impressive Android tablet right here.
"Dell's Venue 8 7000 is the thinnest tablet around, and that's not even the most exciting thing about it. This premium Android slate packs a Moorefield-based Atom processor with quad x86 cores, a RealSense camera that embeds 3D depth data into still images, and a staggeringly beautiful OLED display that steals the show. Read on for our take on a truly compelling tablet."
Here are some more Mobile articles from around the web:
- Lenovo ThinkPad X1 Carbon Works Great As A Linux Ultrabook @ Phoronix
- Cooler Master NotePal ERGOSTAND III Review @ Techgage
- Portable Smartphone Battery Pack Roundup @ eTeknix
- Sandberg Outdoor Powerbank 10400 mAh Review @ NikKTech
- Xiaomi Mi4 64GB Smartphone Review @ Madshrimps
Subject: General Tech | February 25, 2015 - 08:56 PM | Tim Verry
Tagged: PowerVR, Intel, Imagination Technologies, igp, finance
Update: Currency exchange rates have been corrected. I'm sorry for any confusion!
Intel is selling off its remaining stake in UK-based Imagination Technologies (IMG.LN). According to JP Morgan, Intel is selling 13.4 million shares (4.9% of Imagination Technologies) for 245 GBp each. Once all shares are sold, Intel will gross just north of $50.57 Million USD.
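The gross figure follows directly from the share count and price; a quick check, assuming an exchange rate of roughly 1.54 USD per GBP (approximately the rate at the time):

```python
# Rough check of the quoted gross, from 13.4M shares at 245 GBp (pence)
# each, at an assumed exchange rate of ~1.54 USD per GBP.
shares = 13_400_000
price_gbp = 245 / 100            # 245 pence = 2.45 pounds per share
usd_per_gbp = 1.54               # assumed rate
gross_usd = shares * price_gbp * usd_per_gbp
print(round(gross_usd / 1e6, 2))  # 50.56, within rounding of the quoted $50.57M
```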
Imagination Technologies' PowerVR Rogue Series 6XT GPU is used in Apple's A8-series chips.
Intel first invested in Imagination Technologies back in October of 2006 in a deal to gain access to the company’s PowerVR graphics IP portfolio. Since then, Intel has been slowly moving away from PowerVR graphics in favor of its own internal HD Graphics GPUs. (Further, Intel sold off 10% of its IMG.LN stake in June of last year.) Even Intel’s low-cost Atom line of SoCs has mostly moved to Intel GPUs, with the exception of the mobile Merrifield and Moorefield smartphone/tablet SoCs.
The expansion of Intel’s own graphics IP, combined with Imagination Technologies’ acquisition of MIPS, is reportedly the “inevitable” reason for the sale. According to The Guardian, industry analysts have speculated that, as it stands, Intel accounts for less than 5% of Imagination Technologies’ graphics business (and a licensing agreement signed this year doesn’t rule out PowerVR graphics permanently despite the sale). Imagination Technologies still has a decent presence in the mobile (ARM-based) space with customers including Apple, MediaTek, Rockchip, Freescale, and Texas Instruments.
Currently, the company’s stock price is sitting at 258.75 GBp (~$3.99 USD), which seems to indicate that the news of Intel’s sell-off was already priced in, or that investors simply aren’t that concerned.
What do you think about the sale? Where does this leave Intel as far as graphics goes? Will we see Intel HD Graphics scale down to smartphones or will the company go with a PowerVR competitor? Would Intel really work with ARM’s Mali, Qualcomm’s Adreno, or Samsung’s rumored custom GPU cores? On that note, an Intel powered smartphone with NVIDIA Tegra graphics would be amazing (hint, hint Intel!)
Quiet, Efficient Gaming
The last few weeks have been dominated by talk about the memory controller of the Maxwell-based GTX 970. There are some very strong opinions about that particular issue, and certainly NVIDIA was remiss in actually informing consumers about how that particular product handles its memory. While that debate rages, we have somewhat lost track of other products in the Maxwell range. The GTX 960 was released during this firestorm and, while it shares the outstanding power/performance qualities of the Maxwell architecture, it is considered a little overpriced for its performance when compared to other cards in its price class.
It is easy to forget that the original Maxwell based product to hit shelves was the GTX 750 series of cards. They were released a year ago to some very interesting reviews. The board is one of the first mainstream cards in recent memory to have a power draw that is under 75 watts, but can still play games with good quality settings at 1080P resolutions. Ryan covered this very well and it turned out to be a perfect gaming card for many pre-built systems that do not have extra power connectors (or a power supply that can support 125+ watt graphics cards). These are relatively inexpensive cards and very easy to install, producing a big jump in performance as compared to the integrated graphics components of modern CPUs and APUs.
The GTX 750 and GTX 750 Ti have proven to be popular cards due to their overall price, performance, and extremely low power consumption. They also tend to produce relatively little heat, thanks to solid cooling combined with that low power draw. The Maxwell architecture also introduced some new features, but the major changes are to the overall design of the architecture as compared to Kepler: instead of 192 cores per SMX, there are now 128 cores per SMM. NVIDIA has done a lot of work to improve performance per core as well as to lower power in a fairly dramatic way. An interesting side effect is that the CPU hit with Maxwell is a couple of percentage points higher than with Kepler. NVIDIA does lean a bit more on the CPU to improve overall GPU performance, but most of this hit is covered up by some really good real-time compiler work in the driver.
Asus has taken the GTX 750 Ti and applied their STRIX design and branding to it. While there are certainly faster GPUs on the market, there are none that exhibit the power characteristics of the GTX 750 Ti. The combination of this GPU and the STRIX design should result in an extremely efficient, cool, and silent card.
Subject: General Tech | March 1, 2015 - 09:11 PM | Scott Michaud
Tagged: quixel, GDC, gdc 15, ue4, unreal engine 4, gdc 2015
You know that a week will be busy when companies start announcing a day or two early to beat the flood. While Game Developers Conference starts tomorrow, Quixel published their Jungle demo to YouTube today in promotion of their MEGASCANS material library. The video was rendered in Unreal Engine 4.
Their other material samples look quite convincing. The vines on a wall (a column in this case) are particularly interesting because they even look like two distinct layers, despite being a single mesh with displacement as far as I can tell. I don't know; maybe it is two or three layers. It would certainly make sense if it were, but the top and bottom suggest that it is a single layer, and that is impressive. It even looks self-occluding.
Pricing and availability for the library have not yet been disclosed, but it sounds like it will be a subscription service. The software ranges from $25 to $500, depending on what you get and what sort of license you need (Academic vs Commercial and so forth).
Followers of PC Perspective have likely seen a pair of stories previewing the upcoming performance and features of USB 3.1. First we got our hands on the MSI X99A Gaming 9 ACK motherboard and were able to run through our very first hands-on testing with USB 3.1 hardware. The motherboard had built-in USB 3.1 support, and we tested it with a device configured as a RAID-0 array of Intel SSD 730 Series drives.
We followed that up with a look at the ASUS USB 3.1 implementation, which included a PCIe add-on card and a dual-drive mSATA device, also in RAID-0. This configuration was interesting because, theoretically, this $40 product can be installed into any system with a free PCI Express slot.
Performance was astounding for incredibly early implementations, reaching as high as 835 MB/s!
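For context, here is a back-of-envelope sketch of how that 835 MB/s figure compares to the theoretical ceiling. The numbers assume USB 3.1 Gen 2 signaling at 10 Gbps with 128b/132b line encoding, and they ignore protocol overhead, so treat this as a rough upper bound rather than a precise spec calculation:

```python
# Rough USB 3.1 Gen 2 payload ceiling (assumption: 10 Gbps signaling
# with 128b/132b line encoding, protocol overhead ignored).
signal_rate_bps = 10.0e9
encoding_efficiency = 128 / 132          # 128b/132b line coding
payload_mb_s = signal_rate_bps * encoding_efficiency / 8 / 1e6
print(round(payload_mb_s))               # → 1212 MB/s theoretical ceiling

observed_mb_s = 835                      # measured in our earlier testing
print(round(observed_mb_s / payload_mb_s * 100))  # → 69 (% of ceiling)
```

In other words, even these very early implementations were already hitting roughly two thirds of the raw line rate.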
In that last article I theorized that it would be some time before we got our hands on retail USB 3.1 hardware but it appears I wasn't giving the industry enough credit. ASUS passed us a list of incoming devices along with release schedules. There are 27 devices scheduled to be released before the end of April and ~35 by the middle of the year.
It's a daunting table to look at, so be prepared!
The product categories are mostly dominated by the likes of a USB 3.1 to 2.5-in adapter; that would be useful, but you aren't going to top out the performance of USB 3.1 with a single 2.5-in SATA device. Iomaster has one listed as a "USB 3.1 to MSATA & M2 SSD enclosure" which could be more interesting - does it accept PCI Express M.2 SSDs?
Minerva Innovation has a couple of interesting options, all listed with pairs of mSATA or M.2 ports, two with Type-C connections. What we don't know based on this data is whether they support PCIe M.2 SSDs or SATA only, and whether they support RAID-0.
A couple more list dual SATA ports, which might indicate that we are going to see multiple hard drives / SSDs over a single USB 3.1 connection, but without RAID support. That could be another way to utilize the bandwidth of USB 3.1, similar to how we planned to use Thunderbolt daisy chaining.
We don't have pricing yet, but I don't think USB 3.1 accessories will be significantly more expensive than what USB 3.0 devices sell for. So, does this list of accessories make you more excited to upgrade your system for USB 3.1?
Subject: General Tech | February 24, 2015 - 12:56 PM | Jeremy Hellstrom
Tagged: fud, security, smartphone
Tracking your smartphone's location via aggregate battery usage is not the most efficient or accurate method, but it can be done, and Samsung (and others) have not provided a switch which makes that particular data private. Researchers have shown that by tracking the battery drain of the 3G cellular radio, one can determine the distance from the cellular base station the phone is connected to, along with a coarse location based on environmental interference factors such as buildings which partially block the signal. It is only a very coarse locator, but it does give better information than just the base station the phone is connected to, and since we are creatures of habit it allows tracking of normal patterns of movement. This is nowhere near as accurate as GPS tracking and does require a bit of work to pull off, but as battery usage and levels are sent by the phone in the clear, with no method of preventing that, it should cause some privacy concerns for users. You can read the research paper (in PDF) by following the link from The Inquirer.
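The underlying physics is simple enough to sketch. A phone's radio has to transmit harder the farther it sits from the base station, so higher radio power draw loosely implies greater path loss, which can be inverted into a coarse distance estimate. The toy below is NOT the researchers' actual method (they fit machine-learned models to real power traces); it just illustrates the idea using the standard free-space path loss formula, with the 1900 MHz band and the 100 dB loss figure as made-up example values:

```python
import math

def distance_from_path_loss(path_loss_db, freq_mhz=1900.0, exponent=2.0):
    """Invert the log-distance path loss model to a distance in km.

    Free-space form: PL = 10*n*log10(d_km) + 20*log10(f_MHz) + 32.44,
    where n is the path loss exponent (2.0 in free space; higher in
    cluttered urban environments, which is what makes this so coarse).
    """
    log_d = (path_loss_db - 20 * math.log10(freq_mhz) - 32.44) / (10 * exponent)
    return 10 ** log_d

# A hypothetical 100 dB of estimated loss at 1900 MHz works out to
# roughly 1.26 km from the tower in free space.
print(round(distance_from_path_loss(100.0), 2))  # → 1.26
```

Real environments raise the exponent well above 2.0 and add building shadowing, which is exactly why the technique only yields a coarse fix rather than GPS-grade coordinates.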
"SCIENTISTS have warned of a new smartphone risk after discovering that battery power can be used to track a person's movements."
Here is some more Tech News from around the web:
- A billion things are already on the IoT: Verizon @ The Inquirer
- ARM and IBM bolster Internet of Things with cloud-based mbed starter kit @ The Inquirer
- May the fourth be with you: Torvalds names next Linux v 4.0 @ The Register
- UK Scientists Claim 1Tbps Data Speed Via Experimental 5G Technology @ Slashdot
- Intel Moving Forward With 10nm, Will Switch Away From Silicon For 7nm @ Slashdot
Subject: General Tech | February 27, 2015 - 04:41 PM | Tim Verry
Tagged: SFF, nuc5i7ryh, nuc, Intel, broadwell-u, Broadwell
We recently reviewed a new small form factor NUC PC from Intel powered by Broadwell. That i5-powered NUC5i5RYK will soon be joined by an even higher end Broadwell NUC (NUC5i7RYH) equipped with an i7-5557U CPU and Iris 6100 graphics.
According to FanlessTech, this slightly thicker NUC will come as a barebones system with a processor, motherboard, and wireless card pre-installed in a case with customizable lids (to add NFC, wireless charging, or other features). Note that, unlike the Broadwell i5 version we reviewed, this model supports 2.5” SSDs.
External I/O includes:
- 2 x USB 3.0 ports (one charging capable)
- 1 x Audio jack
- 1 x IR sensor
- 2 x USB 3.0 ports
- 1 x Gigabit Ethernet RJ45
- 1 x Mini HDMI 1.4a
- 1 x Mini DisplayPort 1.2
Internally, the NUC5i7RYH is powered by a dual core (with Hyper-Threading) i7-5557U processor clocked at 3.1 GHz base and 3.4 GHz turbo with 4MB cache and a 28W TDP. The processor also features Intel’s Iris 6100 GPU, which our own Scott Michaud estimates at 48 execution units and 845 GFLOPS of performance. He further speculates that it reaches a similar level of theoretical performance as the Intel Iris 5100 graphics (used in Haswell CPUs) by using more (but lower clocked, at up to 1050 MHz) shaders.
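For those curious where a figure like that comes from, here is a rough sketch of the usual peak-FLOPS arithmetic. It assumes each Gen8 execution unit has two four-wide FMA-capable SIMD units, i.e. 2 × 4 × 2 = 16 single-precision FLOPs per EU per clock; note that the quoted ~845 GFLOPS corresponds to a boost clock near 1.1 GHz rather than the 1050 MHz mentioned above:

```python
# Peak single-precision GFLOPS estimate for Intel Gen8 graphics
# (assumption: 16 FLOPs per EU per clock from 2 x SIMD-4 FMA units).
def gen8_gflops(eus, clock_ghz, flops_per_eu_per_clock=16):
    return eus * flops_per_eu_per_clock * clock_ghz

print(round(gen8_gflops(48, 1.05), 1))  # → 806.4 (at 1050 MHz)
print(round(gen8_gflops(48, 1.10), 1))  # → 844.8 (at an assumed 1.1 GHz)
```

As always with peak-FLOPS math, these are theoretical ceilings, not sustained gaming performance.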
The Iris 6100 GPU is likely to be the highest-end processor graphics we will see with Broadwell-U. It supports 4K resolutions at 24Hz as well as video decode (though apparently not hardware accelerated) of VP8, VP9, and H.265 (HEVC) via wired displays or over Intel’s WiDi wireless display technology. Further, the GPU supports DirectX 12 in its current iteration as well as OpenGL 4.3 and OpenCL 2.0.
Internal connectivity includes support for two DDR3L SODIMMs (up to 16GB), a single 2.5” solid state drive, one M.2 SSD, an Intel Wireless AC 7265 card (802.11ac+BT), an NFC header, and a header for two USB 2.0 ports.
Intel has not released pricing, but expect it to hit at least $500 since the i5 version without Iris graphics has an MSRP of $399. It is slated to arrive soon with a launch window of Q2 2015.
Subject: Shows and Expos | February 24, 2015 - 11:14 PM | Morry Teitelman
Tagged: QuakeCon 2015, quakecon, id software
Courtesy of ZeniMax Media
The yearly gaming mecca known as QuakeCon, featuring the biggest BYOC (bring your own computer) LAN in the great state of Texas, is set to run from July 23 through July 26. What you may not know is that the QuakeCon team has announced the registration dates for the event. Like last year, all pre-registration spots in the BYOC will be pay-for only, with no First-Come-First-Served spots available.
This year, there will be a total of five registration rounds offered:
- BYOC Select-a-Seat with UAC Command Center Seating
- 32 packages, $500 per package
- Wednesday, March 4 at 7 PM CST / 8 PM EST
- BYOC Select-a-Seat with QuakeCon done Quick
- 300 packages, $175 per package
- Wednesday, March 11 at 7 PM CST / 8 PM EST
- BYOC Select-a-Seat + Swag Pack
- 500 packages, $170 per package
- Wednesday, March 18 at 7 PM CST / 8 PM EST
- BYOC Select-a-Seat
- 1600 packages, $55 per package
- Wednesday, March 25 at 7 PM CST / 8 PM EST
- Swag Pack
- 50 packages, $125 per package
- Wednesday, April 1 at 7 PM CST / 8 PM EST
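For a sense of scale, a quick tally of the registration rounds listed above:

```python
# Package counts and prices as listed in the registration rounds above.
packages = {
    "UAC Command Center Seating": (32, 500),
    "QuakeCon done Quick":        (300, 175),
    "Select-a-Seat + Swag Pack":  (500, 170),
    "Select-a-Seat":              (1600, 55),
    "Swag Pack":                  (50, 125),
}

total_packages = sum(count for count, _ in packages.values())
total_revenue = sum(count * price for count, price in packages.values())
print(total_packages, total_revenue)  # → 2482 247750
```

That works out to 2,482 total packages, and just under a quarter million dollars if every round sells out.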
If you are familiar with the QuakeCon pay-for packaging strategies, most of the packages will look familiar. The newest offering is the UAC Command Center Seating package, featuring a VIP seat in the NOC with direct access for your system to the network backbone, guaranteeing you the fastest network access at the event.
Subject: General Tech | February 25, 2015 - 03:35 PM | Jeremy Hellstrom
Tagged: VLAN party, kick ass, Homeworld Remastered, gaming, fragging frogs
That's right: for those of you who pre-ordered Homeworld Remastered, and for anyone who pops by Steam to purchase it, your productivity is in for a serious hit as you try to guide your fleet to a new homeworld and then defend it. For those lucky and old enough to have played through the originals, you will find the look vastly improved, and from what Rock, Paper, SHOTGUN and other reviewers have found, you will also love the improved interface. For those who have not had the pleasure of playing through these two games before, the $33 investment is more than worth it, especially with improved multiplayer coming in the near future. Check out the videos and overview of the poster child for revamped legacy games here.
You will have to take a break this Saturday though, as the Fragging Frogs Virtual LAN party #9 kicks off at 10AM ET and will end when the last frog drops. You can check out the official thread in the forums right here to get all the information you need to participate. AMD and other mystery sponsors will be giving away prizes to those who log into and participate in the TeamSpeak channels, which are also the best way to chat in game and in the general lobby. You can also check out the list of games that will be played, as well as links to the mods and patches you will need; please download and install them before Saturday to maximize your playing time. See you there!
"In terms of strategy games which ‘need’ remastering, Homeworld was probably somewhere at the bottom of the list. But in terms of strategy games which really, truly benefit from remastering – well, this is a chart-topper."
Here is some more Tech News from around the web:
- Touch The Sky: Sid Meier’s Starships Release On March 12 @ Rock, Paper, SHOTGUN
- The Order: 1886 – Round Table game's all right on the knight @ The Register
- Sunless Sea game review @ Bjorn3d
- Saints Row: Gat Out of Hell Review @ OCC
- What Evolve could learn from Monster Hunter @ Kitguru
- GTA V for PC launch re-scheduled to 14th April 2015 @ HEXUS
- Ziggy's Mod (Far Cry 3) @ Nexus