All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | June 18, 2017 - 07:01 AM | Scott Michaud
Tagged: pc gaming, hitman
During its most recent financial results briefing (PDF), Square Enix announced that it would withdraw from IO Interactive, the studio behind the Hitman series of video games. The studio's most recent release, Hitman (2016), is one of our major benchmarks because it was among the first titles to rework its engine for DirectX 12 (and it's also a very pretty game). The previous entry, Hitman: Absolution, was also featured on one of our live streams because it was an AMD Gaming Evolved / Never Settle title.
IO Interactive has followed up with their own announcement. As of last Friday, they are an independent studio, having negotiated to keep both their management and the Hitman IP. “We are now open to opportunities with future collaborators and partners to help strengthen us as a studio and ensure that we can produce the best games possible for our community.” In other words, they don’t seem to have a publisher lined up yet, but the Hitman franchise should be enticing to many AAA-level companies.
Just a couple of weeks earlier, IO Interactive announced that new purchases of Hitman would automatically include all episodes from the first season. Steam will, however, detect existing episodes and only bill you for the ones you’ve missed. They say that “these changes will help us lay the foundations for our future plans for HITMAN,” but it’s unclear what they mean at this point.
Subject: General Tech, Graphics Cards | June 17, 2017 - 09:23 PM | Ken Addison
Tagged: nicehash, mining, cryptocurrency
Over the last several weeks, we have been experimenting with the most recent GPU-shortage-inducing coin mining craze, with Ken's article as a jumping-off point. On a recent podcast, I mentioned the idea of running a community coin mining group that would let individuals contribute to PC Perspective. I received several requests for the wallet and setup information to make this happen, so I thought it would be worthwhile to gather all the necessary links and info in a single location.
We have been running a Patreon campaign for a couple of years now as a way for readers and viewers who find PC Perspective a useful resource to contribute directly. It might be because you want to keep the PCPer staff stable, or because you use an ad blocker and are looking for a way to even things out. But some people don't have the ability or desire to sign up for a new service, so contributing your idle GPU cycles is another option if you want to donate to the PCPer team.
How do you do it? Ken has created a step-by-step guide below - thanks for your support in this and all of our previous endeavors!
- Bitcoin: 1HHhVWPRpCUst9bDYtLstMdD7o5SzANk1W
- Ethereum: 0xa0294763261aa85eB5f1dA3Ca0f03E1B672EED87
For those of you who may be curious to try out this mining stuff on your personal computer, we would recommend looking into the NiceHash application.
For those of you who haven't read our previous article, NiceHash is a service that connects buyers of GPU mining power to sellers who have spare hardware that they are looking to put to use.
As a warning, if you are planning to mine, please be aware of your power consumption. To get a good idea of this, look up the TDP of your graphics card, multiply that wattage by the hours you plan to mine, divide by 1000 to convert watt-hours into kilowatt-hours, and multiply by the rate you pay for electricity (found on your power bill in cents per kilowatt-hour in the US). In short: watts × hours × days ÷ 1000 × kWh rate. (Thanks, CracklingIce!)
Given the current rates of value for these cryptocurrencies, power is a small portion of the gross profit made by mining, but it is important to be aware of this before you are presented with a huge power bill that you weren't expecting.
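The arithmetic above is easy to get wrong at 2 AM, so here is a minimal sketch of it in Python. The 180 W TDP and 12¢/kWh rate in the example are placeholder assumptions for illustration only; substitute your own card's TDP and the rate from your power bill.

```python
# Rough electricity-cost estimate for GPU mining, following the
# watts * hours * days / 1000 * rate formula described above.
def mining_power_cost(tdp_watts, hours_per_day, days, cents_per_kwh):
    """Return an estimated electricity cost in dollars.

    tdp_watts: the card's TDP (a reasonable proxy for mining draw)
    cents_per_kwh: your utility rate in US cents per kilowatt-hour
    """
    kwh = tdp_watts * hours_per_day * days / 1000.0  # watt-hours -> kWh
    return kwh * cents_per_kwh / 100.0               # cents -> dollars

# Example (placeholder numbers): a 180 W card mining 24 hours a day
# for 30 days at 12 cents per kilowatt-hour.
cost = mining_power_cost(180, 24, 30, 12)
print(f"~${cost:.2f}/month")  # prints ~$15.55/month
```

Note that this is a ceiling estimate based on TDP; actual draw while mining is usually somewhat lower and varies by algorithm.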
First, download the latest version of the NiceHash miner application from their website.
After your download has finished, extract the ZIP file and load the NiceHashMiner.exe program.
Once the application has been launched and you've accepted the terms of the EULA, the NiceHash Miner will start to download the appropriate mining applications for your given hardware.
Note: during this installation process, your antivirus program might report malware. The miner executables being downloaded are safe, but many antivirus programs flag them because, if found on a PC without the owner's permission, they are a telltale sign of malicious software.
After the installation process is completed, you will be brought to the main screen of the application.
From here, choose the server location closest to you, add the Bitcoin address (in this case: 1HHhVWPRpCUst9bDYtLstMdD7o5SzANk1W), and choose a unique worker name (up to 7 characters long).
From here, hit the Benchmark button, select the devices you want to mine on (we would recommend GPUs only; CPUs don't earn very much), and hit the Start button.
Once the benchmarking is done, you'll be brought back to the main screen of the application where you can hit the Start button.
Once you hit the start button, a command prompt window will launch where you can see the miner at work (this can be hidden from the NiceHash setting pane), and you can view the stats of your computer in the original NiceHash application window.
And that's it, your computer will now be mining towards the PCPER community pool!
Subject: General Tech | June 16, 2017 - 02:38 PM | Jeremy Hellstrom
Tagged: gaming headset, cerebus, audio, asus
The ASUS Cerberus is a little different from other headsets: it uses 53mm drivers and two microphones, a removable boom mic plus a permanently attached in-line mic. For audiophiles, the headset has a 32 Ω impedance, a 20 Hz – 20 kHz frequency response, and a somewhat muddy sound; for gamers it has very heavy bass, which can make explosions quite startling. TechPowerUp were not in love with the audio performance but found the headset to be extremely comfortable, so it could be perfect for those who prefer comfort over a beautiful soundstage.
"Asus Cerberus V2 is the successor to the company's bestselling headset. Now equipped with a stainless steel headband and the new "Essence drivers", it's supposed to be sturdier and better sounding. However, with its $75 price tag, it faces some stiff competition and doesn't necessarily come out as the victor."
Here is some more Tech News from around the web:
- Cougar Immersa @ techPowerUp
- HyperX CLOUD Revolver S Pro Gaming Headset Review @ NikKTech
- 1MORE Capsule Dual Driver In-ears @ techPowerUp
- Cougar Megara @ techPowerUp
Subject: General Tech | June 16, 2017 - 01:29 PM | Jeremy Hellstrom
Tagged: msi, VR One, htc vive, oculus rift
MSI states their VR One is the world’s lightest and thinnest high-performance backpack PC system, which makes sense considering the utter lack of competition in that area. It may also claim to be the most expensive, as the price ranges from $1700 to $2300; [H]ard|OCP tested the high-end model in their recent review. Inside is a Kaby Lake Core i7-7820HK, 16GB of 2166MHz DDR4, dual M.2 storage drives, and the mobile version of the GTX 1070; certainly enough to power a Rift or Vive. The battery life is more impressive than you might expect: starting from 92% it lasted 1 hour and 37 minutes, and from 96%, 1 hour and 41 minutes, with 2 hours required to recharge the battery to over 95%. It is an investment, but being able to experience VR without tripping on cords is an attractive proposition.
"The MSI VR ONE is quite simply a full PC that comes in the form of a backpack that allows you to connect your HTC Vive or Oculus Rift for a "wireless" VR experience. This VR ONE unit packs a GTX 1070 laptop GPU to hopefully supply us with the needed 90 frames per second performance required for a perfect Virtual Reality experience."
Here is some more Tech News from around the web:
- CIA's 'CherryBlossom' hacking tool allows router traffic to be intercepted @ The Inquirer
- Months late, unaudited: ZX Spectrum reboot firm files accounts @ The Register
- Banking websites are 'littered with trackers' ogling your credit risk @ The Register
- True Cross-play Gaming Could Become A Reality If Sony Wasn’t Holding Out @ Techgage
- Windows Server joins Windows 10 and Office in the bi-annual updates club @ The Inquirer
- Lenovo Reveals The 2017 ThinkCentre Desktop PCs @ TechARP
- EU Poised To Fine Google More Than $1 Billion in Antitrust Case @ Slashdot
- Xtorm EVOKE 10.000mAh Solar Charger Review @ NikKTech
Subject: Processors | June 15, 2017 - 04:00 PM | Ryan Shrout
Tagged: xeon scalable, xeon, skylake-x, skylake-sp, skylake-ep, ring, mesh, Intel
Though we are just days away from the release of Intel’s Core i9 family based on Skylake-X, and a bit further away from the Xeon Scalable Processor launch using the same fundamental architecture, Intel is sharing a bit of information on how the insides of this processor tick. Literally. One of the most significant changes to the new processor design comes in the form of a new mesh interconnect architecture that handles the communications between the on-chip logical areas.
Since the days of Nehalem-EX, Intel has utilized a ring bus architecture for processor design. The ring bus operated in a bi-directional, sequential method that cycled through various stops. At each stop, the control logic would determine whether data was to be collected from or deposited with that module. These ring bus stops are located at the CPU cores/caches, the PCI Express interface, the memory controllers, the LLC, and so on. This ring bus was fairly simple and easily expandable by adding more stops on the ring itself.
However, over several generations, the ring bus has become quite large and unwieldy. Compare the ring bus from Nehalem above to the one for last year’s Xeon E5 v4 platform.
The spike in core counts and other modules caused a ballooning of the ring that eventually turned into multiple rings, complicating the design. As you increase the stops on the ring bus, you also increase the physical latency of messaging and data transfer, which Intel compensated for by increasing the bandwidth and clock speed of the interface, at the expense of power and efficiency.
For an on-die interconnect to remain relevant, it needs to be flexible in bandwidth scaling, reduce latency, and remain energy efficient. With 28-core Xeon processors imminent, and new IO capabilities coming along with it, the time for the ring bus in this space is over.
Starting with the HEDT and Xeon products released this year, Intel will be using a new on-chip design called a mesh that Intel promises will offer higher bandwidth, lower latency, and improved power efficiency. As the name implies, the mesh architecture is one in which each node relays messages through the network between source and destination. Though I cannot share many of the details on performance characteristics just yet, Intel did share the following diagram.
As Intel indicates in its blog on the mesh announcements, this generic diagram “shows a representation of the mesh architecture where cores, on-chip cache banks, memory controllers, and I/O controllers are organized in rows and columns, with wires and switches connecting them at each intersection to allow for turns. By providing a more direct path than the prior ring architectures and many more pathways to eliminate bottlenecks, the mesh can operate at a lower frequency and voltage and can still deliver very high bandwidth and low latency. This results in improved performance and greater energy efficiency similar to a well-designed highway system that lets traffic flow at the optimal speed without congestion.”
The bi-directional mesh design allows a many-core design to offer lower node to node latency than the ring architecture could provide, and by adjusting the width of the interface, Intel can control bandwidth (and by relation frequency). Intel tells us that this can offer lower average latency without increasing power. Though it wasn’t specifically mentioned in this blog, the assumption is that because nothing is free, this has a slight die size cost to implement the more granular mesh network.
Using a mesh architecture offers a couple of capabilities and also requires a few changes to the cache design. By dividing up the IO interfaces (think multiple PCI Express banks, or memory channels), Intel can provide better average access times to each core by intelligently spacing the location of those modules. Intel will also be breaking up the LLC into different segments which will share a “stop” on the network with a processor core. Rather than the previous design of the ring bus where the entirety of the LLC was accessed through a single stop, the LLC will perform as a divided system. However, Intel assures us that performance variability is not a concern:
Negligible latency differences in accessing different cache banks allows software to treat the distributed cache banks as one large unified last level cache. As a result, application developers do not have to worry about variable latency in accessing different cache banks, nor do they need to optimize or recompile code to get a significant performance boost out of their applications.
There is a lot to dissect when it comes to this new mesh architecture for Xeon Scalable and Core i9 processors, including its overall effect on the LLC cache performance and how it might affect system memory or PCI Express performance. In theory, the integration of a mesh network-style interface could drastically improve the average latency in all cases and increase maximum memory bandwidth by giving more cores access to the memory bus sooner. But, it is also possible this increases maximum latency in some fringe cases.
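To build some intuition for why a mesh helps at high core counts, here is a toy sketch (my own, not from Intel's disclosure) comparing average node-to-node hop counts on a bidirectional ring versus a 2D mesh. The 28-stop ring and 4×7 grid are illustrative assumptions, not Intel's actual floorplan, and real latency also depends on clocks, queuing, and cache placement.

```python
# Toy comparison: average shortest-path hops between distinct nodes
# on a bidirectional ring vs. a 2D mesh with the same node count.
from itertools import product

def ring_hops(n):
    """Average hops on a bidirectional ring of n stops (min of the two directions)."""
    total = sum(min(abs(a - b), n - abs(a - b))
                for a in range(n) for b in range(n) if a != b)
    return total / (n * (n - 1))

def mesh_hops(rows, cols):
    """Average hops on a rows x cols mesh, routing along rows/columns (Manhattan distance)."""
    nodes = list(product(range(rows), range(cols)))
    total = sum(abs(r1 - r2) + abs(c1 - c2)
                for (r1, c1) in nodes for (r2, c2) in nodes
                if (r1, c1) != (r2, c2))
    n = len(nodes)
    return total / (n * (n - 1))

# A 28-stop ring vs. a hypothetical 4x7 mesh covering the same 28 stops:
print(f"ring: {ring_hops(28):.2f} hops, mesh: {mesh_hops(4, 7):.2f} hops")
# The mesh averages roughly half the hops of the ring at this node count,
# which is the structural advantage Intel's blog post is describing.
```

The gap widens as node counts grow: ring distance scales linearly with stop count, while mesh distance scales roughly with the square root, which is why the ring made sense at 4–8 cores but not at 28.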
Further testing awaits for us to find out!
Subject: Cases and Cooling | June 15, 2017 - 01:34 PM | Jeremy Hellstrom
Tagged: tempered glass, corsair, Crystal Series, 570x
It has been quite a while since Sebastian reviewed Corsair's Crystal Series 570X tempered glass case, so why not take another look? Over at Techgage you can revisit this case with a view. They were impressed by the included cooling, three fans plus a pre-installed hub for three more RGB fans, as well as the air filter placement, which helps keep dust out of the case. There is no equivalent feature to keep fingerprints off the glass front and sides, so you will spend some time cleaning up your case. Then again, if you are choosing a transparent enclosure, you likely spend a lot of time ensuring all your components are looking their best.
"Corsair’s Crystal series is named as such because of its use of tempered glass, and as the top dog in the current lineup, the 570X sports that tempered glass on all four sides. Despite its delicate frame, the chassis proved great to build with, and as we found out, its beautiful aesthetics don’t hurt its cooling efficiency."
Here are some more Cases & Cooling reviews from around the web:
- Gamemax Polaris (RGB Tempered Glass) @ Kitguru
- Bitfenix Portal Mini-ITX Case @ Benchmark Reviews
- Nanoxia CoolForce 1 Mid-Tower Review @ NikKTech
- Cooler Master MasterCase Pro 6 Review @ Bjorn3d
- Streacom FC8 Alpha mini-ITX chassi @ Bjorn3d
- LEPA NEOllusion @ techPowerUp
Subject: General Tech | June 15, 2017 - 01:07 PM | Jeremy Hellstrom
Tagged: blame canada, crtc, spineless
The CRTC, Canada's equivalent of the USA's FCC, managed to collect enough backbone to utter a statement directing Canadian mobile carriers to unlock any of their phones without charging for the privilege. The current going rate is around $50, a bit less if you trust that shady character in the alley.
They also murmured something about how inappropriate it is to allow a child to simply reply "yes" to a text from their provider to authorize international roaming charges over $100 a month, or data overage fees over $50. They politely inquired whether the phone companies might consider following the CRTC's new suggestion that only the authorized account holder can give approval, even if the teenager swears their parental unit totally said it was OK.
As is their wont, no mention of penalties was made nor did they seem to have acquired any teeth since last they pulled their heads out into the light. You can read more at Slashdot or the CBC, depending on your preference of comments.
"Canada's telecom regulator has announced that as of December 1st, 2017, all individual and small business wireless consumers will have the right to have their mobile devices unlocked free of charge upon request, while all newly purchased devices must be provided unlocked from that day forward."
Here is some more Tech News from around the web:
- Nokia snatches clump of 16nm FinFETs, crafts 576 Tbps monster router @ The Register
- Brazilian whacks Intel over 'exploding' Atom smartphone chips @ The Register
- Practical Networking for Linux Admins: TCP/IP @ Linux.com
- Buying a Memory Card – What You Need to Know @ Hardware Secrets
Subject: General Tech | June 15, 2017 - 10:38 AM | Alex Lustenberg
Tagged: xps, video, Samsung, Project Scorpio, powerplay, podcast, logitech, G433, g-sync, freesync, destiny 2, dell, cryptocurrency, corsair, Area-51, alienware
PC Perspective Podcast #454 - 06/15/17
Join us for talk about the cryptocurrency mining resurgence, the Xbox One X, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano
Peanut Gallery: Alex Lustenberg, Ken Addison
Subject: Graphics Cards | June 14, 2017 - 08:42 PM | Tim Verry
Tagged: zotac, gtx 1080 ti, factory overclocked, gp102, SFF
Zotac recently unveiled a slimmed-down GTX 1080 Ti graphics card that uses a dual-slot, dual-fan cooler with a short PCB. The aptly named Zotac GTX 1080 Ti Mini measures 8.3” (211mm) long and will be the smallest GTX 1080 Ti on the market. Despite the miniaturization, Zotac is still offering a decent factory overclock on the Pascal GPU (but not the memory), with a boost clock of 1620 MHz versus the reference boost clock of 1582 MHz.
Zotac uses two 8-pin PCI-E power connectors to drive the card with its GTX 1080 Ti GPU (3584 CUDA cores) and 11GB of GDDR5X memory clocked at 11 GHz. The slimmed down graphics card features a metal backplate, dual shrouded fans, and a heatsink with aluminum fins and five 6mm heat pipes. The card has three DisplayPort 1.4 ports, one HDMI 2.0b port, and one DL-DVI output with the card supporting up to four simultaneous displays.
The Zotac GTX 1080 Ti Mini should enable quite a bit of horsepower in small form factor systems. The graphics card is model number ZT-P10810G-10P and Zotac has it listed on its website. Unfortunately, Zotac is not yet talking pricing or availability for the shortened card.
It appears that overclocking is not out of the question, but I am curious just how far it could be pushed especially in a small case with tight quarters and less airflow.
Subject: Displays | June 14, 2017 - 03:06 PM | Jeremy Hellstrom
Tagged: UP3218K, ultrasharp, dell, 8k
Ars Technica had the chance to test Dell's new $5000 UltraSharp UP3218K, a 32" 10-bit IPS panel with a resolution of 7680×4320. It takes two DisplayPort 1.4 connections to drive this beast, and since even the GTX 1080 Ti struggles with high graphics settings at 4K, there are some performance problems. Ars tested Rise of the Tomb Raider, Metro: Last Light, and GTA V, and while they ran at 8K on a single GTX 1080 Ti, "they also crashed. A lot." GTA V performed the best of the lot, reaching a high of 50 FPS and a low of 15 FPS, and it looked very pretty while doing so. Drop by to download a screenshot and pan around to get a sense of what this screen can do.
"While Acer's 4K, HDR-ready, 144Hz Predator X27 gaming monitor is pretty hot, Dell has something even better: the 8K Dell UltraSharp UP3218K (buy here). This, if you're unfamiliar, is a display that sports a whopping 7680×4320 pixels spread over a 32-inch 10-bit IPS panel."
Here are some more Display articles from around the web:
- Acer Predator XB252Q 240Hz G-SYNC @ Kitguru
- Nixeus NX-VUE27P 1440P IPS Monitor Review @ Hardware Canucks
- AOC PDS241 24in Monitor @ Kitguru
Subject: General Tech | June 14, 2017 - 01:51 PM | Jeremy Hellstrom
Tagged: linux, gaming, dawn of war III
Dawn of War III received its Linux version earlier this year with support for both OpenGL and Vulkan. Vulkan performance is much better in CPU-bound testing at resolutions under 1080p, and when gaming above that resolution it uses far less CPU than OpenGL. Overall, performance on NVIDIA is about the same on both APIs, while with the current Radeon driver you are better off with OpenGL. As is their usual style, Phoronix tested 18 GPUs, a dozen from NVIDIA and six from AMD, at differing resolutions and graphics quality settings, all the way up to 4K.
"Today marks the highly anticipated debut of Dawn of War III for Linux (and macOS) ported by Feral Interactive. Here are a number of OpenGL and Vulkan benchmarks of NVIDIA GeForce and AMD Radeon graphics cards running Ubuntu Linux with this game."
Here is some more Tech News from around the web:
- Every PC game announced or trailered at E3 2017 @ Rock, Paper, SHOTGUN
- Humble Bundle E3 2017 Digital Ticket
- Saved Games: Interstate ‘76 is the game worth saving from 1997 @ Rock, Paper, SHOTGUN
- Xbox One X: A High End Console With Fixable Shortcomings @ Techgage
- Bethesda jack in for Doom VFR and Fallout 4 VR this year @ Rock, Paper, SHOTGUN
- Oculus Rift VR Benching – AMD vs. NVIDIA – Part 2 @ BabelTechReviews
- XCOM 2: War Of The Chosen coming August 29th @ Rock, Paper, SHOTGUN
- Some of the best E3 2017 PC gaming videos so far @ Hexus
- WH40K: Dawn of War 3 adding Dawn of War-ier modes @ Rock, Paper, SHOTGUN
Subject: General Tech | June 14, 2017 - 12:42 PM | Jeremy Hellstrom
Tagged: irobot, automation, tertill, Kickstarter
While the level of enjoyment that gardening instills varies from person to person, there is one thing we should all be able to agree upon: weeding sucks. The team that brought you the Roomba has a solution in mind and has launched a Kickstarter for Tertill, the robotic weed destroyer. You can back the project and pick up this solar-powered robotic weed eater for as little as $225, with delivery in time for next year's gardening season. Instead of using robotic vision, which can have some interesting interpretations of objects, it will munch anything short enough to pass underneath its spinning string trimmer, unless the plant is protected by one of the provided collars. The collar shown in the video looks easy to replicate with some wire and pliers if you have more seedlings than collars. Drop by to take a look at the campaign and the Tertill in action.
"iRobot veteran and Roomba co-inventor, Joe Jones is a modest man with a big mission: to create robots that make agriculture more efficient, less tedious, and yes, maybe even one day feed the world. After a decade at Harvest Automation building greenhouse robots, his new team at Franklin Robotics has developed Tertill, an affordable, waterproof, solar-powered robot that continuously whacks weeds around your yard."
Here is some more Tech News from around the web:
- Microsoft Fixes Unsupported AMD Hardware @ [H]ard|OCP
- PCIe speed to double by 2019 to 128GB/s @ The Register
- The 2017 AORUS Gaming Motherboards For Intel & AMD @ TechARP
- Intel announces partnerships for the gaming market @ DigiTimes
- Microsoft patches Windows XP due to 'elevated risk' of WannaCry-style attacks @ The Inquirer
- Open Source TurtleBot 3 Robot Kit Runs Ubuntu and ROS on Raspberry Pi @ Linux.com
- OnePlus 5 mega-leak confirms iPhone-esque design, 3.5mm headphone jack @ The Inquirer
- Linksys LGS124P 24-Port Business Gigabit PoE+ Switch Review @ NikKTech
Subject: General Tech | June 13, 2017 - 07:02 PM | Tim Verry
Tagged: vpro, SFF, sbc, modular computer, Intel, computex, compute card
Launched earlier this year at CES, Intel’s credit card sized Compute Cards will begin shipping in August. Intel and its partners used Computex to show off the Compute Card itself along with prototype and concept devices based around the new platform.
techtechtech opened up the Core M3-7Y30 equipped Compute Card at Computex.
As a quick refresher, the Compute Card is a full PC in a small card shaped form factor measuring 95mm x 55mm x 5mm that features an Intel SoC, DDR3 RAM, solid state storage, wireless connectivity, and standardized I/O (one USB-C and a proprietary Intel connector sit side by side on one edge of the card). The small cards are designed to slot into devices that will use the Compute Card as their brains for smart home automation, appliances, industrial applications, smart whiteboards, and consumer products such as tablets, notebooks, and smart TVs.
At its Computex press events, Intel revealed details on specifications. The initial launch will include four Compute Card SKUs, two lower-end and two higher-end models. All four cards are equipped with 4GB of DDR3 RAM and either 64GB of eMMC or a 128GB SSD. The two lower-end SKUs use Intel Wireless-AC 7265, while the more expensive models have Intel Wireless-AC 8265 (both are 2x2 802.11ac with Bluetooth 4.2). Processor options from top to bottom include the 7th-generation Core i5-7Y57, Core m3-7Y30, Pentium N4200, and Celeron N3450. Enterprise customers will appreciate the TPM support and security features. Reportedly, the Compute Cards will start at $199 for the low-end model and go up to $499+ for the higher-end cards.
Intel partners Dell, HP, and Lenovo were reportedly not ready to show off any devices but will launch Compute Card compatible devices at some point. ECS, Foxconn, LG Display, NexDock, Sharp, and others did have prototype devices at Computex and have announced their support for the platform. The Compute Card concept devices shown off include tablets, laptops, All-In-Ones, digital signage, kiosks, and a monitor stand dock that lets users add their own monitor and have an AIO powered by a Compute Card. Other uses include ATMs, smart whiteboards, mini PCs for desktop and HTPC use, and docks that would allow business users and students to have a single PC with storage that they could take anywhere and get work done. Students could plug their Compute Card into a laptop shell, a computer lab PC, a whiteboard for presentations, their home dock, and other devices.
(My opinions follow:)
It is an interesting concept that has been tried before with smartphones (Samsung is currently trying it with the S8 and its dock) but never really caught on. The promise of easily upgrading a smart TV, computer, smart appliance, home security system, etc. without replacing the entire unit (just the brains) is a great one, but thus far it has not gained traction. Similarly, the idea of a single PC that you carry everywhere in your pocket and use with whatever display you have handy has been promised before but never delivered. Perhaps Intel can drive this modular PC idea home, and we could finally see it come to fruition.
Unexpectedly absent from the list of partners are Asus and Samsung. Samsung I can understand, since they are trying to do their own thing with the S8, but I was a bit surprised that Asus was not out front with Compute Card support; they were Intel's partner on the Zenfone, and they seem like a company with a good balance of R&D and manufacturing power, yet nimble enough to test out new markets. The other big PC makers (Dell, HP, and Lenovo) aren't ready with their devices yet either, though, so I guess we will just have to see what happens in terms of support and adoption. The other thing that could hold the Compute Card back is that Intel will reportedly allow manufacturer lock-in, where devices and Compute Cards can be made to work only with hardware from the same manufacturer. Restricting interoperability might hurt the platform, but it might also create less confusion for consumers, with the onus on each manufacturer to actually support an upgrade path.
What are your thoughts on the Compute Card?
Subject: General Tech | June 13, 2017 - 04:08 PM | Scott Michaud
Tagged: xbox, pc gaming, microsoft
Before we begin, the source of this post is a PC Gamer interview with Microsoft’s Phil Spencer, who leads the Xbox team. The tone seems to be relaxed and conversational, so, for now, it should be taken as something that he, personally, wants to see, not what the division is actually planning, necessarily.
Still, after it was announced that the Xbox One would get emulation for original Xbox titles at the Xbox E3 2017 Press Conference, PC Gamer asked whether that feature, like so many others lately, could make it to the PC.
His responses: “Yes.” and “I want people to be able to play games!”
He also talked about Xbox 360 emulation on PC, specifically how it would be difficult, but he wants games to run across console and PC. “I want developers to be able to build portable games, which is why we’ve been focusing on UWP for games and even apps that want to run on multiple devices.”
You might know my personal opinions about UWP by now: specifically, how it limits artistic freedom going forward through signed apps and developers, which is a problem for civil rights groups that either need to remain anonymous or publish expressions that governments (etc.) don’t want seen in public. But cross-device support is indeed one of the two reasons it’s seductive for Microsoft. Content written for it cannot do malware-like things (unless it finds an unpatched exploit, which is how Apple iOS jailbreaks work), and it should be abstract enough to easily hop platforms.
But you won’t see me talk ill about preserving old content, especially if it could be lost to time based on a platform decision they made fifteen years ago. I hope that we do see original Xbox games on the PC. I also hope that we develop art in a medium that doesn’t need awkward methods of preservation, though.
Subject: General Tech, Graphics Cards | June 13, 2017 - 01:39 PM | Jeremy Hellstrom
Tagged: nvidia, free games, evga, destiny 2
Were you a fan of the original Destiny, or simply a fan of free games who happens to be shopping for a new NVIDIA GPU? EVGA have just launched a new giveaway: if you pick up one of their GTX 1080 or 1080 Ti cards, they will provide a code that not only grants you a free copy of Destiny 2 but also access to the beta.
As usual, you need an EVGA account so you can register your GPU and have the code delivered to your account. From there, head over to NVIDIA to redeem the code and patiently await the start of the beta and the final release of the game.
June 13th, 2017 - Get Game Ready with EVGA GeForce GTX 10 Series and experience Destiny 2 on PC. For a limited time, buy a select EVGA GeForce GTX 1080 Ti or EVGA GeForce GTX 1080 graphics card and get Destiny 2 at PC Launch and [Early] Access to the PC Beta!
GeForce GTX 10 Series GPUs brings the beautiful world of Destiny 2 to life in stunning 4K. Experience incredibly smooth, tear-free gameplay with NVIDIA G-SYNC™ and share your greatest gameplay moments with NVIDIA ShadowPlay using GeForce Experience.
About Destiny 2:
Humanity's last safe city has fallen to an overwhelming invasion force, led by Ghaul, the imposing commander of the brutal Red Legion. He has stripped the city's Guardians of their power, and forced the survivors to flee. You will venture to mysterious, unexplored worlds of our solar system to discover an arsenal of weapons and devastating new combat abilities. To defeat the Red Legion and confront Ghaul, you must reunite humanity's scattered heroes, stand together, and fight back to reclaim our home.
Learn more and see qualifying EVGA cards at https://www.evga.com/articles/01112/destiny-2-game-ready/
Subject: Graphics Cards | June 13, 2017 - 01:17 PM | Jeremy Hellstrom
Tagged: nvidia, gtx 1080 ti, GTX 1080 Ti GAMING X, msi, Twin Frozr VI, 4k
MSI's latest version of the GeForce GTX 1080 Ti is the GAMING X 4K, and it has the design features you would expect: Twin Frozr VI cooling, Hi-C CAPs, Super Ferrite Chokes, and Japanese solid capacitors. When benchmarking the card, [H]ard|OCP saw performance significantly higher than the quoted 1657MHz boost speed: the average was 1935MHz before they overclocked, and an impressive 2038MHz was the highest stable in-game frequency. They tested both the default and overclocked frequencies against a battery of benchmarks, including the newly released Prey. The card performed admirably at 4K, with many games still performing well with all graphics options at maximum; drop by for a look.
"We review a custom GeForce GTX 1080 Ti based video card with custom cooling and a factory overclock built for overclocking. Can the MSI GeForce GTX 1080 Ti GAMING X truly deliver a consistent enjoyable high-end graphics setting gameplay experience in games at 4K finally? Is a single card viable for current generation gaming at 4K?"
Here are some more Graphics Card articles from around the web:
- Asus ROG Strix GeForce GTX 1080 OC Edition 8GB 11Gbps Video Card Review @ Bjorn3d
- 15-Way NVIDIA/AMD OpenCL GPU Linux Benchmarks Of Ethereum Ethminer @ Phoronix
- XFX RX 460 4GB Heatsink Edition Review @ Bjorn3d
- XFX Rs XXX Edition Rx 570 4GB OC Review @ Bjorn3d
Subject: General Tech | June 13, 2017 - 12:31 PM | Jeremy Hellstrom
Tagged: security, cryptocurrency, Raspberry Pi
If you are using a Raspberry Pi and did not set up two-factor authentication, or even worse, never changed the default password on the system, then there is a very good chance you are mining for someone other than yourself. A new piece of malware, in addition to the many which already exist, is targeting Raspberry Pi machines and recruiting them into a mining pool, rather than the more usual tactic of enlisting them in a botnet for DDoS attacks. Hack a Day has some additional suggestions over and above the glaringly obvious recommendation not to keep default passwords; at least in this particular case, the credentials are not hard-coded into the system.
"According to Russian security site [Dr.Web], there’s a new malware called Linux.MulDrop.14 striking Raspberry Pi computers. In a separate posting, the site examines two different Pi-based trojans including Linux.MulDrop.14. That trojan uses your Pi to mine some form of cryptocurrency. The other trojan sets up a proxy server."
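That advice boils down to basic hardening. A minimal sketch for a stock Raspbian install (assuming the default `pi` account and the standard OpenSSH service; adjust names and paths for your own setup):

```shell
# Replace the well-known default password on the 'pi' account
passwd

# Refuse password logins over SSH entirely (key-based auth only).
# Add or edit these lines in /etc/ssh/sshd_config:
#   PasswordAuthentication no
#   PermitRootLogin no
# ...then reload the SSH daemon so the change takes effect:
sudo systemctl reload ssh
```

With password authentication disabled, a bot scanning for the default credentials has nothing left to guess against.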
Here is some more Tech News from around the web:
- Ta-ta, security: Bungling Tata devs leaked banks' code on public GitHub repo, says IT bloke @ The Register
- Why Ethereum Is Outpacing Bitcoin @ Slashdot
- WiMax routers from Huawei and ZTE are vulnerable to authentication bypass attacks @ The Inquirer
- Mac ransomware author is giving away malicious code to script kiddies @ The Register
- Biostar, ASRock, Colorful see rising demand for mining motherboards, says paper @ DigiTimes
- Move over, Stuxnet: Industroyer malware linked to Kiev blackouts @ The Register
Subject: Displays | June 12, 2017 - 07:01 PM | Scott Michaud
Tagged: g-sync, free sync, dell, alienware, 240Hz
Also at the E3 event, Alienware launched a gaming monitor with two SKUs: one with G-Sync and one with FreeSync. Otherwise, these displays are apparently identical. They also apparently have lighting on the back, although it’s unclear whether this is RGB or locked to the Alienware shade of teal. (I’m guessing it’s Alienware teal.) At first, I was wondering why you would even want a light behind a display at all, but I guess it would make sense if it was very low power and you could leave it on while the rest of the display is off, giving a slight glow to an otherwise dark room.
As for the specifications: both of these displays operate at 240 Hz, native, not overclocked. To achieve this rate, its panel is 24.5-inch, 1080p, and TN. The structure itself has a thin bezel on the top, left, and right side, although the bottom has a bit more thickness for the Alienware typeface logo and buttons. Despite being otherwise identical, the G-Sync model (AW2518H) has an MSRP of $699.99, while the FreeSync model (AW2518HF) is $200 cheaper at $499.99.
Both models launch on June 13th.
Subject: General Tech | June 12, 2017 - 07:01 PM | Scott Michaud
Tagged: gaming mouse, e3 17, E3, dell, alienware
As mentioned in the Alienware mechanical keyboard news, the brand is pushing back into gaming peripherals at this year’s E3 conference. This announcement covers their pair of RGB-lit gaming mice, the Alienware Advanced Gaming Mouse (AW558) and the Alienware Elite Gaming Mouse (AW958): a base model and a higher-end variant with more customization.
Both mice are built on a nine-button, right-handed chassis. It’s difficult to tell from the photos, but it looks like the mouse has three buttons on the thumb side, two on the pinky side, and an extra button on the top (as well as left, right, and scroll wheel click, of course). I could be wrong about this, though. The RGB lighting, two strips of it below the buttons and one going up the palm rest, forming a triangular crosshair, is available on both models.
So what’s different about the Elite? The higher-end mouse can have its side grips replaced to change up the form and feel of the thumb buttons. It can also have weights added to it, which should help twitch gamers get used to it more quickly, since they can tune it to feel slightly more familiar. Interestingly, the higher-end model (AW958) can store five DPI profiles, while the lower-end one (AW558) can only store three. I don’t know why they didn’t just let both choose five.
The Alienware Advanced Gaming Mouse (AW558) has an MSRP of $49.99 USD and the Alienware Elite Gaming Mouse (AW958) has an MSRP of $89.99 USD. They are available on June 13th.
Subject: Systems | June 12, 2017 - 07:00 PM | Sebastian Peak
Tagged: Threadripper, sli, ryzen, RX 580, PC, gtx 1080 ti, gaming, desktop, dell, crossfire, amd, alienware
Dell has revealed their new Alienware Area-51 gaming desktops featuring the latest high-performance AMD and Intel processors. We will begin with a look at the Alienware Area-51 Threadripper Edition, and Dell has an exclusive on pre-built systems using the new Ryzen Threadripper CPUs.
"Through 2017, Dell will be the exclusive OEM partner to deliver AMD Ryzen Threadripper pre-built systems to the market and the high-end 16-core will be factory-overclocked across all 16-cores and 32 logical threads. The Area-51 Threadripper Edition is ideal for customers who explore the world of mega-tasking, doing many system demanding tasks at the same time, and are looking for a complete, reliable solution from a trusted brand."
The systems are based on the X399 chipset and can be configured with either a 12-core/24-thread or a 16-core/32-thread AMD Ryzen Threadripper processor, which is liquid-cooled in all configurations. Standard memory configurations begin with quad-channel 2667 MHz DDR4 up to 64GB, with 2933 MHz HyperX memory available up to the same quad-channel 64GB. Graphics options begin with a choice between an NVIDIA GeForce GTX 1050 Ti or AMD Radeon RX 570, and max out at either dual GTX 1080 Ti or triple Radeon RX 580 cards.
Storage options include up to a 1TB M.2 PCIe SSD and 2TB 7200RPM SATA 6Gb/s HDD, and networking is handled by dual Killer E2500 Gigabit NICs and a choice of either Dell 1820 802.11ac 2x2 or Killer 1535 802.11ac 2x2 Wi-Fi. (A look at the other Area-51 desktop announcement provides a more complete look at the rest of the general specifications, with a few chipset-related differences.)
Features from Dell/Alienware:
- Designed for Megatasking, game streaming and more, the new Area 51 Threadripper Edition is ready for today’s most demanding PC gaming enthusiast and supports high performance configurations with a chipset that enables up to 64 PCIe Gen 3 lanes.
- All configurations come standard with unlocked AMD Ryzen Threadripper CPUs, factory-overclocked across all cores and liquid-cooled with Alienware's most powerful liquid cooling unit to date.
- Iconic, high-quality triad chassis, uniquely engineered to deliver exceptional airflow, thermal management, and user ergonomics for daily use and future upgrades.
- Supports NVIDIA SLI and AMD Crossfire graphics technology, with dual and triple GPU options
- Introduces M.2 storage options to Area-51.
- Built for gaming enthusiasts wanting the absolute best gaming performance, whether played on a VR, 4K, or 8K display
- Alienware Command Center includes AlienFX, AlienAdrenaline, AlienFusion, Thermal and Overclocking Controls
The Alienware Area-51 Threadripper Edition will be available beginning July 27; pricing information has not yet been announced.