Subject: General Tech | October 4, 2017 - 09:19 PM | Scott Michaud
Tagged: render token, ethereum, 3D rendering
You know how people have been buying up GPUs to mine coin? A new company, Render Token, has just announced a service that works in a similar way, except that the output is rendered images. A better example would be something like Folding@Home, but the user is paid for the work that their computer performs. The CEO and the President, Jules Urbach and Alissa Grainger respectively, are co-founders of OTOY, which does GPU- and Cloud-accelerated rendering.
According to Jules Urbach at Unite Austin, Render Token is deliberately paying more than Ethereum mining would yield for the same amount of processing power.
I am... torn on this issue. On the one hand, it’s a cool application of crowd-sourced work, and it helps utilize idle silicon scattered around the globe. On the other hand, I hope that this won’t kick GPU supply levels while they’re down. Sure, at least there’s some intrinsic value to the workload, but I can just see people sticking racks of caseless systems in their basement, while gamers keep browsing Amazon for something under four digits (excluding the cents) to appear in stock.
What do you all think? Does the workload usefulness dull the pain?
Subject: General Tech | October 4, 2017 - 08:59 PM | Scott Michaud
Tagged: 3D rendering, otoy, Unity, deep learning
When raytracing images, sample count has a massive impact on both quality and rendering performance. This corresponds to the number of rays within a pixel that were cast, which, when averaged out over many, many rays, eventually matches what the pixel should be. Think of it this way: if your first ray bounces directly into a bright light, and the second ray bounces into the vacuum of space, should the color be white? Black? Half-grey? Who knows! However, if you send 1000 rays with some randomized pattern, then the average is probably a lot closer to what it should be (which depends on how big the light is, what it bounces off of, etc.).
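The averaging described above is just Monte Carlo estimation. As a rough illustration (a hypothetical pixel where 25% of ray directions happen to hit the light; the numbers are made up, not from OTOY), a few lines of Python show how the estimate settles down as samples accumulate:

```python
import random

def sample_pixel(num_samples, light_coverage=0.25):
    """Estimate a pixel's brightness by casting random rays.

    light_coverage is the fraction of ray directions that reach the
    light (a stand-in for a real scene; purely illustrative). Each ray
    contributes 1.0 (hit the light) or 0.0 (bounced into darkness),
    and the pixel value is the average over all samples.
    """
    hits = sum(1 for _ in range(num_samples)
               if random.random() < light_coverage)
    return hits / num_samples

# One ray is all-or-nothing noise; a thousand rays converge near 0.25.
print(sample_pixel(1))
print(sample_pixel(1000))
```

With one sample the pixel is either fully lit or fully dark; with a thousand, the error shrinks roughly with the square root of the sample count, which is exactly why a denoiser that tolerates low sample rates saves so much render time.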
At Unite Austin, which started today, OTOY showed off an “AI temporal denoiser” algorithm for raytraced footage. Typically, an artist chooses a sample rate that looks good enough to the end viewer. In this case, the artist only needs to choose enough samples that an AI can create a good-enough video for the end user. While I’m curious how much performance is required in the inferencing stage, I do know how much a drop in sample rate can affect render times, and it’s a lot.
Check out OTOY’s video, embedded above.
Subject: Mobile | October 4, 2017 - 04:32 PM | Sebastian Peak
Tagged: smartphone, pOLED, Pixel 2 XL, Pixel 2, pixel, phone, Oreo, google, DxOMark, Android 8, AMOLED
Google has announced the Pixel 2 and Pixel 2 XL smartphones, the second-gen versions of the Nexus-replacement Pixel which launched last October. We looked at that first Pixel phone, which was the premier Android device at the time, and these new Pixel 2 devices hope to place Google at the top of the heap again (with stiff competition from Samsung, of course).
The Google Pixel 2 XL (image credit: Google)
The Pixel 2 arrives in a standard version with a 5-inch 1920x1080 AMOLED display, and an XL version with a new 6-inch pOLED display with 2880x1440 resolution. Both phones are powered by the 8-core Snapdragon 835 and feature 4GB of RAM and the option of either 64GB or 128GB of storage (no card slot on either phone).
While the design of the Pixel 2 is largely unchanged compared to last year, with large bezels above and below the display, the Pixel 2 XL comes closer to the ever-popular “all screen” look with its smaller top/bottom bezels.
The Google Pixel 2 (image credit: Google)
Both phones offer dual front-facing stereo speakers as well, unlike iPhones which have to combine an earpiece speaker and bottom-firing speaker for their stereo effect. The battery capacities are a little different than last year with both Pixel 2 phones, with a 2700 mAh battery (down from 2770 mAh) in the Pixel 2, and a 3520 mAh battery (up from 3450 mAh) in the Pixel 2 XL.
It’s all about camera
Once again, Google is proclaiming the Pixel 2 camera as the best in the industry, and again this is based on testing from DxOMark, which has it ranked #1 overall among smartphones with an incredible 98 out of a possible 100 in their scoring system.
Image credit: DxOMark
Both sizes of Pixel 2 offer a single 12.2 MP rear camera (sorry, no dual cameras here) with 1.4μm pixels, laser + dual-pixel phase-detection autofocus, OIS, and an f/1.8 aperture. Fans of simulated lens bokeh need have no fear, as Google’s dual-pixel sensor design is said to allow for better portrait-style photos than the original Pixel. Video of up to 4K (but only at 30 FPS) is supported, and an 8 MP f/2.4 camera handles front-facing duties.
More on those new displays
Google has improved the display technology with the Pixel 2, as both versions now offer wide color gamut support (95% DCI-P3 coverage from the Pixel 2, and a full 100% DCI-P3 from the Pixel 2 XL). The displays are now ‘always on’, a handy feature that makes sense from a power standpoint when working with AMOLED panels (and hard to give up once you’ve grown accustomed to it as I did with the Galaxy S8+). Last but not least, covering these new displays is Corning Gorilla Glass 5, which is the most drop-resistant version to date (and is also found on the Galaxy S8/S8+ among other phones).
A comparison of LCD and OLED technologies (image credit: Android Authority)
The Pixel 2 XL’s “pOLED” display designation suggests a polymer OLED panel, which has the advantage of being much thinner than traditional glass OLED substrates. (Read more about AMOLED vs. P-OLED here.)
The Pixel 2 phones ship with the new Android 8.0 Oreo, with the promise of a “minimum” of 3 years of OS and security updates. Vanilla Google phone owners (previously Nexus) have enjoyed being the first to receive new OS updates, and that should still be the case with these new devices. And if you are coming over from another platform - say, Apple, for instance - a “quick switch” adapter is in every box to help transfer data quickly between phones.
The Quick Switch Adapter in action (image credit: Google)
Google is offering the (unlocked) phone for sale directly from their website, and has partnered with Verizon as the exclusive mobile carrier, as they did with the original Pixel. The price? $649 gets you the 5-inch Pixel 2 with 64GB of storage, or double that to 128GB for $100 more. The Pixel 2 XL is available for $849 for the 64GB capacity, with the same $100 premium for a 128GB version. There are also four color options this year, with the whimsical naming fully intact from the previous generation: Just Black, Clearly White, Kinda Blue, and Black & White.
Oh, and one more thing: the 3.5 mm headphone jack is gone.
Subject: General Tech | October 4, 2017 - 02:06 PM | Jeremy Hellstrom
Tagged: gaming, Star Wars, Star Wars Battlefront 2, gamespy
We lost access to the original Star Wars Battlefront 2 back in 2014 when GameSpy's servers were shut down, but thanks to a dedicated group of fans and support from Disney the game is once again playable. There are still some problems, ranging from servers not responding to deadly lag, but a team is working to resolve them. From the trailer released earlier this week, linked below, we can see that the new version about to be released is nowhere near as disappointing as the reboot of 2015. There will be far more maps, vehicles, and heroes in the new game, but we have yet to see if it will match the fun of the original. Hopefully it will, though it won't match the original's current price of $4 on GOG.
"Battlefront 2's online relaunch doesn't seem to be going entirely smoothly for everyone, though. The game's Steam forums are currently clogged with threads such as Online is back but I can t [sic] play it, Can't join any servers, Multiplayer crashing and Insane server lag."
Here is some more Tech News from around the web:
- EA shares lengthy Star Wars Battlefront II trailer @ HEXUS
- How hitting a game cartridge unlocks gaming’s weirdest Easter egg @ Ars Technica
- Europa Universalis IV: Cradle of Civilization announced @ Rock, Paper, SHOTGUN
- Cuphead is the prettiest game to make you throw your controller @ Ars Technica
- Red Dead Redemption 2 trailer introduces its cowboy @ Rock, Paper, SHOTGUN
- Humble Stardock Bundle
- Dawn of War 3 removes Skulls, unlocks doctrines & elites @ Rock, Paper, SHOTGUN
- Project CARS 2 PC & VR performance evaluation – the Red vs. the Green Team @ BabelTechReviews
- Battlestar Galactica Deadlock is the sci-fi strategy game I’d have killed for in 2005 @ Rock, Paper, SHOTGUN
- Vintage spaceship FPS-RTS Allegiance free on Steam @ Rock, Paper, SHOTGUN
Subject: General Tech, Processors | October 4, 2017 - 01:05 PM | Jeremy Hellstrom
Tagged: amd, ryzen, price cuts
AMD is slashing prices on its Ryzen line of CPUs, and not just in the UK. A Ryzen 7 1800X in the US will cost you only $400 if you skip the Wraith cooler, or $500 in Canada. If that is a little too rich, the 1700X is $295 ($415 in Canada), though the 1700 with Wraith cooler at $370 might be a better deal. The price cuts come just before the launch of Intel's Coffee Lake processors, so you might want to wait a day or so for reviews to appear. They could also signal AMD's desire to move stock before the launch of Pinnacle Ridge in a few months.
Wasn't that much more pleasant than finding out the IRS plans to crowdsource its tax fraud investigations by awarding a $7m contract to Equifax, who can count on everyone who grabbed your leaked personal information to do their work for them?
"They also coincide with rumours that AMD plans to launch a new series of Ryzen parts in February, based on 12nm process technology. The AMD Ryzen ‘Pinnacle' parts will be part of a shift of both CPUs and GPUs to GlobalFoundries 12nm LP [leading performance] process during 2018."
Here is some more Tech News from around the web:
- More Than 80 Percent of All Net Neutrality Comments Were Sent By Bots, Researchers Say @ Slashdot
- Dawn of Solar Age Declared as PV Beats All Other Forms of Power @ Slashdot
- Microsoft shows off Windows 10 Second Li, er, Mixed Reality @ The Register
- Got a Yahoo account? Yeah, you got hacked @ The Inquirer
- Azure fell over for 7 hours in Europe because someone accidentally set off the fire extinguishers @ The Register
- The HUAWEI Kirin 970 Deep Dive Tech Report @ TechARP
Subject: General Tech | October 3, 2017 - 10:00 PM | Scott Michaud
Tagged: VR, Samsung, pc gaming, microsoft
The upcoming Fall Creators Update will be Microsoft’s launch into XR with headsets from a variety of vendors. You can now add Samsung to that list with their Odyssey VR headset and motion controllers, which is important for two reasons. First, Samsung has a lot of experience in VR technology, as they led the charge (with their partner, Oculus) in the mobile space.
Second, and speaking of Oculus, the Samsung Odyssey actually has a higher resolution than both the Oculus Rift and the HTC Vive (2880x1600 total for Samsung vs. 2160x1200 total for the other two). This doesn’t sound like a lot, but it’s actually 77% more pixels, which might be significant for text and other fine details. The refresh rate is still 90 Hz, and the field of view is around 110 degrees, the same as the HTC Vive. Of course the screen technology itself is AMOLED, being that it’s from Samsung, and deeper blacks are more important in an enclosed cavity than brightness. In fact, you probably want to reduce brightness in a VR headset so you don’t strain your eyes.
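For the curious, the pixel-count comparison is simple arithmetic (resolutions as listed above, totals across both eyes):

```python
# Total pixels for each headset (both eyes combined).
odyssey = 2880 * 1600    # Samsung Odyssey
vive_rift = 2160 * 1200  # HTC Vive / Oculus Rift

increase = odyssey / vive_rift - 1
print(f"{odyssey:,} vs {vive_rift:,} pixels: {increase:.1%} more")
```

That works out to roughly 78% more pixels, which rounds down to the ~77% figure quoted above.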
According to Peter Bright of Ars Technica, Microsoft is supporting SteamVR titles, which gives the platform a nice catalog to launch with. The Samsung Odyssey VR headset launches November 6th for $499 USD.
Subject: Systems | October 3, 2017 - 05:42 PM | Jeremy Hellstrom
Tagged: zotac, zbox, Magnus EN1080K, GeForce GTX 1080, i7-7700, SFF, water cooler
The newest Zbox from Zotac is also the most powerful one they have made, which makes it a bit of a different beast than other Zotac SFF products. With an i7-7700 paired with a GTX 1080, along with 16GB of DDR4-2400 and a WD Black 512GB NVMe M.2 SSD, the Magnus offers more power than you find in many a mid-range system. The heat produced in the tight confines of the system, 8.9 x 8 x 5" (23 x 20 x 13 cm), is handled by a custom-built water-cooling system which cools both the CPU and GPU. This does make the system significantly larger than previous Zbox products, and it is much more power hungry, requiring two power adapters to run. The Tech Report loved the performance but did encounter some significant issues with the Zbox, which they overcame with quick and effective support from Zotac. Check this one out for the impressive build design as well as its equally impressive gaming abilities.
"Zotac's Zbox Magnus EN1080K pairs Intel's Core i7-7700 CPU with a GeForce GTX 1080 graphics card in an impressively dense liquid-cooled package. We ran some of our favorite games on this system to see how it stacks up in the small-form-factor pantheon."
Here are some more Systems articles from around the web:
- Return To The Asus Tinker Board: Have Six Months Changed Anything? @ Hack a Day
- Upgrade My PC Please! Ep 5: Dem Tings Wit Graphics @ Techspot
- Pairing CPUs and GPUs: PC Upgrades and Bottlenecking @ Techspot
Subject: General Tech | October 3, 2017 - 01:16 PM | Jeremy Hellstrom
Tagged: asus, ASUS ROG, strix, GL753VD, gaming notebook, gtx 1050
There are some things to like about this ASUS ROG Strix laptop: the Core i5-7300HQ with up to 12GB of DDR4 is nothing to sneer at, and the inclusion of an M.2 SSD and a USB 3.1 Type-C port will be appreciated. On the other hand, the 17.3" IPS display has a 1080p resolution and is powered by a GTX 1050, which is simply not enough to power a VR headset. The price is around $1000, making it more affordable than many gaming laptops, but as Kitguru points out, by sacrificing the IPS display for a TN panel you can choose from a variety of models which house a GTX 1060. You can see the full series of benchmarks they performed here.
"Unfortunately, though the ROG Strix GL753VD has the tagline “gaming without limits”, its relatively low-end Nvidia GTX 1050 graphics chip makes it likely that those limits will crop up rather sooner than the average gamer might like, especially in demanding titles. So can the rest of the package and its overall price still convince?"
Here are some more Mobile articles from around the web:
- The Microsoft Surface Pro 2017 Hands-On Preview @ Tech ARP
- macOS 10.13 High Sierra vs. Ubuntu Linux Performance @ Phoronix
- Apple Watch Series 3 review: LTE comes with high monetary and mental costs @ Ars Technica
- iPhone 8 and 8 Plus review: The curious case of the time-traveling phone @ Ars Technica
Subject: General Tech | October 3, 2017 - 12:41 PM | Jeremy Hellstrom
Roku announced updated hardware and software today, which The Register covered here. The Roku Express and Roku Express+ have had a speed upgrade, with a claimed five-fold increase in responsiveness, while the Streaming Stick and Streaming Stick+ now come with a voice-activated remote; the higher-end model is capable of 4K and 4K HDR at up to 60fps. The top-end Roku Ultra offers the same features as its predecessor but now at a lower price.
More interesting is the software update: Roku OS 8 now lets you search over-the-air TV in its menu if you have an antenna configured, and if you have an account with providers such as Dish, Cox, or AT&T you will be able to access them on your Roku. You should expect to see the update become available for existing Roku devices, as well as these new models, later this month.
"The company is still leading the streaming media box market, in large part because it is simply better and offers more than its main competitors in AppleTV, Amazon's Fire TV, and Google's Chromecast."
Here is some more Tech News from around the web:
- How OpenBSD and Linux Mitigate Security Bugs @ Linux.com
- Nvidia partners with Gigabyte and Leadtek for AI, says paper @ DigiTimes
- John McAfee finally reveals Sentinel anti-hacking system @ The Inquirer
- Microsoft extends its Skype-on-a-bike redesign to Linux users @ The Inquirer
- Shortages of all-screen displays to continue into 2018 @ DigiTimes
- Smart burglar alarms: Look who just tossed their hat into the ring ... It's, er, Ring @ The Register
Subject: General Tech | October 3, 2017 - 12:00 AM | Tim Verry
Tagged: usb 3.2, usb-if, USB 3 Type-C, type c, usb
The USB Implementers Forum recently published and made official the specifications for the USB 3.2 standard, first introduced in near-final form by the USB 3.0 Promoters Group back in July. The USB 3.2 standard specifies the physical and logical techniques for transferring data over physical USB cables (which are now all specified under their own standards, decoupled from the USB 3.2 data transfer specifications) at up to 20 Gbps (~2 GB/s), using two 10 Gbps channels and the same signaling and 128b/132b encoding used by USB 3.1.
Like Thunderbolt, USB 3.2 achieves its total bandwidth with multiple lanes rather than trying to clock a single channel at twice the speed, which is incredibly complex. In the case of USB 3.2, the specification defines two channels that can run at 2 x 5 Gbps or 2 x 10 Gbps, depending on whether a USB 3.1 Gen 1 (5 Gbps) or USB 3.1 Gen 2 (10 Gbps) cable is used. In fact, users will be able to re-use their existing USB Type-C cables to connect USB 3.2 hosts to USB 3.2 devices so long as they are up to spec. The USB-IF is able to achieve this by using the extra wire pairs in the Type-C cables to enable two-lane operation. (5 Gbps cables would be upgraded to 10 Gbps speeds, and 10 Gbps cables would be upgraded to 20 Gbps speeds, when used with USB 3.2 hardware at both ends.)
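The headline numbers fall out of simple lane math. A quick sketch (the 128/132 ratio is the line-encoding overhead mentioned above):

```python
def usb_throughput_gbps(lanes, lane_rate_gbps,
                        payload_bits=128, total_bits=132):
    """Usable multi-lane bandwidth after 128b/132b line encoding."""
    return lanes * lane_rate_gbps * payload_bits / total_bits

gen1_x2 = usb_throughput_gbps(2, 5)    # USB 3.2 over a Gen 1 cable
gen2_x2 = usb_throughput_gbps(2, 10)   # USB 3.2 over a Gen 2 cable
print(f"2 x 5 Gbps:  {gen1_x2:.2f} Gbps usable")
print(f"2 x 10 Gbps: {gen2_x2:.2f} Gbps usable "
      f"(~{gen2_x2 / 8:.2f} GB/s before protocol overhead)")
```

At 2 x 10 Gbps that works out to roughly 19.4 Gbps (about 2.4 GB/s) before protocol overhead, in the same ballpark as the ~2 GB/s figure quoted above.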
The specification is expected to be finalized by the end of the year, with USB 3.2 controllers and other hardware beginning production and rollout in 2018. Devices supporting the faster USB 3.2 standard are expected as soon as 2019. Desktop users should get access first, in the form of PCI-E add-on cards with new USB 3.2 controllers from third parties, with native CPU and chipset support from AMD and Intel following in a generation or two (processor generations, that is). Laptop and mobile users will have to wait until at least 2019, if not later, for the faster standard to come standard.
It is interesting that they have decoupled the USB data transfer standard from the physical cable standards. USB Type-C cables are the star of the show, but Type-A and Micro cables are not going away and could be used with USB 3.2, with the caveat that you would need to buy new USB 3.2 cables. Those new cables should be backwards compatible with older USB standards, but current cables (SuperSpeed Type-C cables being the exception) aren't forwards compatible: they might work, but they won't support the higher speeds. At least that is my understanding of it. I am curious whether Type-C will be more prevalent with USB 3.2 or if we will still see motherboards with a single USB Type-C port nestled among many more Type-A ports. I suppose the ratio of Type-C to Type-A ports will depend on how many new devices adopt Type-C as the USB 3.2 physical interface of choice, something we will just have to wait and see on!
It is nice to see some competition for Thunderbolt, though even at 20 Gbps USB 3.2 still lags behind the 40 Gbps of Thunderbolt 3 (20 Gbps with passive copper cables), which Intel is allegedly planning to make royalty free next year. USB 3.2 also has more overhead and is less ideal for things like external graphics. On the other hand, it may just be the cheap-enough, fast-enough connector that gets the design wins while Thunderbolt continues to be a prosumer and professional interface for higher-end motherboards, PCs, and end devices.
If you are interested in the new 20 Gbps USB 3.2 specifications, the USB-IF has provided a 103 MB zip file with several documents including a 548 page PDF of the new standard and a redline comparison between it and USB 3.1 among other related documents for developers.
Subject: General Tech | October 2, 2017 - 04:12 PM | Tim Verry
Tagged: X399, Threadripper, nvme raid, NVMe, amd, 960 PRO
A recent support page and community update posting suggest that NVMe RAID support is coming to Threadripper and the X399 platform imminently (as soon as motherboard manufacturers release an updated BIOS/UEFI). AMD will support up to six NVMe drives without adapters in a RAID 0, 1, or 10 array with all the drives wired directly to the PCI-E controller in the CPU rather than being routed through the chipset (meaning no DMI bottlenecking). There are no limits on the brand of drives and the NVMe RAID update is free with no hardware or software keys needed to unlock it.
NVMe SSDs are very fast on their own, but when combined in a RAID array directly wired to the CPU things really get interesting. AMD claims that it saw read speeds of 21.2 GB/s when reading from six Samsung 960 Pro 512 GB drives in a RAID 0 array! The company also saw near-perfect scaling with its test array (when going from a single drive up to six), with reads scaling 6x and writes 5.38x. Intel's VROC seems to have the theoretical performance advantage here, with the ability to RAID more total drives (four per VMD and three VMDs per CPU), but only after purchasing a hardware key, and when using more than one VMD it can't be a bootable OS array. When it comes to bootable arrays, AMD would appear to have the upper hand with free support for up to six drives that can be used to run your bootable OS array! Windows has never booted faster! (heh)
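To put AMD's numbers in context, here is the back-of-the-envelope scaling math. (The 3.5 GB/s single-drive figure is Samsung's rated sequential read for the 960 Pro, used here as an assumption; AMD did not publish its single-drive baseline.)

```python
def raid0_scaling(single_drive_gbs, array_gbs, num_drives):
    """Return the measured scaling factor of a RAID 0 array over one
    drive, and how close that comes to perfect linear scaling."""
    scaling = array_gbs / single_drive_gbs
    efficiency = scaling / num_drives
    return scaling, efficiency

# AMD's claim: 21.2 GB/s reads from six 960 Pro 512 GB drives.
scaling, eff = raid0_scaling(3.5, 21.2, 6)
print(f"{scaling:.2f}x over a single drive, "
      f"{eff:.0%} of perfect six-drive scaling")
```

That comes out to roughly 6x, consistent with AMD's near-perfect read-scaling claim (drives can slightly beat their rated spec in ideal sequential benchmarks, so landing a hair over 100% of "perfect" against the rated figure isn't alarming).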
Along with its partners releasing BIOS updates, AMD is releasing updates to its NVMe RAID driver (version 17.50) and RAIDXpert2 Windows management utility. Currently, Windows 10 x64 build 1703 is officially supported, and fresh installs of Windows are recommended (and if you are currently running your Windows OS off of a RAID array, a fresh install is required).
Once BIOS updates are available (and they are coming shortly), users will have to jump through a few hoops to get an NVMe RAID up and running, but those hoops may just be worth it for enthusiasts wanting the best storage performance! For one, if you have an existing RAID array (bootable or not) you will not be able to do an in-place upgrade. If you have a SATA RAID, you must back up your data and break down the array before updating the UEFI/BIOS and installing the Windows driver. Further, if your existing array is bootable with your operating system installed on it, you will need to back up your data, upgrade the BIOS, and perform a fresh install of Windows with the AMD-supplied F6 driver. After upgrading the BIOS, there will be a new menu item (the exact name will vary by manufacturer, but SATA Mode and SATA Configuration are likely suspects) where users will need to change the mode from SATA or AHCI to RAID.
Oh, and did I mention to back up your data before diving into this? NVMe RAID support for Threadripper is a long-awaited feature and has a lot of promise, with Threadripper offering up 64 PCI-E lanes and, according to AMD, many boards offering 7 slots (6 with a graphics card), which is where AMD gets the six-drive support number. It appears that using adapters like the Asus Hyper M.2 cards or DIMM.2 slots would allow users to go past that six-drive limit, though.
NVMe RAID support on X399 / Threadripper is a feature we are in the process of testing now (see comments) and I am very interested in what the results are! Stay tuned for more information as it develops!
- Finally figured out why THREADRIPPER has so many PCIe lanes (en) [VIDEO] @ der8auer
- Intel VROC Tested! - X299 VROC vs. Z270 RST, Quad Optane vs. Quad 960 PRO
- ASUS X299 Enables Intel Virtual RAID on CPU - RAID-0 up to 20 SSDs!
- Triple M.2 Samsung 950 Pro Z170 PCIe NVMe RAID Tested - Why So Snappy?
Subject: General Tech | October 2, 2017 - 03:38 PM | Jeremy Hellstrom
Tagged: audio, Grado, PS2000e
Grado has a well-deserved reputation for providing quality headphones, a reputation which may be somewhat imperilled by a £1000 ($1328 USD) price increase compared to last year's flagship model. They are certainly pretty, with a smoky chrome exterior, maple reinforcing the interior, and replaceable padding on the earcups for those looking for custom comfort. The drivers offer the same 5-50,000Hz response range as the PS1000e, and an audio player or cellphone can power them, but with a 32-ohm impedance you are better off with a dedicated headphone amp. Kitguru were very impressed with the quality, but it is up to you to decide if the price hike can be justified.
"Unfortunately headphone prices seem to be rising in recent years and the PS2000e are no exception – they hit the UK market at a staggering £2,700 retail. Interestingly, the previous Grado PS1000e flagship is £1,700 — or £1000 less."
Here is some more Tech News from around the web:
- Rosewill Nebula GX50 @ techPowerUp
- Sennheiser GSP 350 Headset @ Kitguru
- HyperX Cloud Revolver S @ techPowerUp
- Jabra Elite Sport True Wireless Earbuds Review @ NikKTech
Subject: General Tech | October 2, 2017 - 02:35 PM | Jeremy Hellstrom
Tagged: input, gaming mouse, corsair, glaive rgb, PMW3360
The Corsair Glaive RGB gaming mouse is focused on comfort; to that end Corsair provides three different thumb rests: one smooth and slightly curved, a textured one with a more pronounced curve, and a textured, almost flat rest. The five programmable buttons include two unique thumb buttons, much larger than on other mice and set fairly high up; after some mental adjustment The Tech Report found themselves pleased with that arrangement. The mouse uses a PixArt PMW3360 optical sensor with five customizable DPI levels, up to a maximum of 16,000. The scroll wheel was not quite up to the standard they expect from Corsair but was still acceptable; all in all, TR have no problem recommending this mouse.
"Corsair's Glaive RGB breaks with Corsair's angular-and-aggressive mouse-shape tradition by adopting a pleasantly rounded chassis that has the potential to be a crowd-pleaser. We took the Glaive to the mat to see how it games."
Here is some more Tech News from around the web:
- Cooler Master MasterSet MS120 Keyboard+Mouse Combo @ Modders-Inc
- Gigabyte Aorus K7 keyboard and Aorus M3 mouse @ Guru of 3D
- ROCCAT Isku+ Force FX Gaming Keyboard @ Modders-Inc
- iKBC F108 RGB Keyboard @ techPowerUp
- SteelSeries Apex M750 Keyboard @ TechPowerUp
- Patriot Viper V770 RGB Mechanical Gaming Keyboard Review @ NikKTech
Subject: General Tech | October 2, 2017 - 12:44 PM | Jeremy Hellstrom
Tagged: microsoft, windows, apple
TechSpot posted an article compiling a variety of tips on making Windows and macOS do what you want, as well as numerous applications you can use for a variety of tasks. The recommendations run from the classic obfuscated Windows "God Mode" folder, which contains links to the majority of the tools you can use on your system, to basic keyboard shortcuts. If you are trying to figure out where all your storage space went, SpaceSniffer for Windows or GrandPerspective for Macs will help you far more than random searches for large folders. You will probably already know a great number of these tips, but it is nice to have a long list compiled in a single location.
"Many hardcore computer users might consider themselves above learning new tricks, but there are always new ways to sharpen your skills on the PC and we bet that you will find at least one useful thing here that you didn't know before."
Here is some more Tech News from around the web:
- IT component shortages could worsen during holidays @ DigiTimes
- Google quietly ditches NFC device unlocking in Android because of 'low usage' @ The Inquirer
- Russian Defense Company Demos A One-Person Flying Car @ Slashdot
- Apple Mac fans told: Something smells EFI in your firmware @ The Register
- iPhone X release date, specs and price: Samsung to earn £80 from every handset sold @ The Inquirer
Subject: Systems | October 1, 2017 - 10:41 PM | Scott Michaud
Tagged: amd, Adobe
AMD is bundling two months of Adobe Creative Cloud with a selection of PCs, ranging from Ryzen-enabled desktops to APU-enabled laptops and all-in-ones. There are also two other bundles available: ~$40 of Square Enix Collective indie games, and a 3-month subscription to Adobe Creative Cloud Photography (Photoshop and Lightroom). This started a couple of months ago, but it looks like AMD didn’t make too big of a deal out of it, so I decided that a late post is better than none.
As for the value of the deal? Eh. Two months of Creative Cloud is quite good from a value perspective -- about a hundred US dollars. That said, a free trial from Adobe is a month, and that's on a per-application basis, so you could also see it as “just a double-length free trial”. (Possibly triple-length if you can use the free trial first, then redeem the Creative Cloud subscription for an extra two. I’m not sure, but the redemption period ends on December 31st, so it might be possible.)
The list of applicable products and OEMs, as well as other terms and conditions, is available on AMD’s website.
Subject: General Tech | September 30, 2017 - 05:57 PM | Scott Michaud
Tagged: pc gaming, lumberyard, amazon
As we mentioned last week, Amazon has been pushing their Lumberyard fork of CryEngine into their own direction. It turns out that much of their future roadmap was actually slated for last Friday, with the release of Lumberyard 1.11.
This version replaces Crytek’s Flow Graph with Amazon’s Script Canvas visual scripting system. (Think Blueprints from Unreal Engine 4.) This lets developers design logic in a flowchart-like interface and attach it to relevant objects, building them up like blocks. Visual scripting is one area that Unity hasn’t (by default) gotten into, as they favour written scripting languages such as C#. (Lumberyard also allows components to be written in C++ and Lua, btw.)
It also replaces Crytek’s CryAnimation, Geppetto, and Mannequin with the EMotion FX animation system from Mystic Game Development. Interestingly, this middleware has been flying under the radar recently. It was popular around the 2006-2009 timeframe with titles such as Gothic 3, Warhammer Online: Age of Reckoning, and Risen. It was also integrated into 2010’s The Lord of the Rings: Aragorn’s Quest and a few racing games, and that’s about it as far as we know. I’m curious to see how development has advanced over the last ten-or-so years, unless its use is more widespread than licensees are allowed to announce. Regardless, it is now Lumberyard 1.11’s primary animation system, so people can get their hands on it and see for themselves.
If you’re interested in developing a game in Amazon Lumberyard, this release has basically all of their forward-looking systems in place. Even though a lot of features are still experimental, and the engine is still in beta, I don’t think you have to worry about being forced to develop in a system that will be deprecated at this point.
Lumberyard is free to develop on, as long as you use Amazon Web Services for online services (or you run your own servers).
Subject: Cases and Cooling | September 30, 2017 - 02:26 AM | Tim Verry
Tagged: tower cooler, just delivered, FM2+, cryorig h5 ultimate, CRYORIG, air cooling, air cooler
Just Delivered is a section of PC Perspective where we share some of the goodies that pass through our labs that may or may not see a review, but are pretty cool none the less.
Find the Cryorig H5 Ultimate on Amazon!
I have been slowly rebuilding my wife's desktop PC following the failure of its all-in-one liquid CPU cooler, which leaked coolant and killed the motherboard and power supply (surprisingly, the GTX 750 Ti survived despite getting a bunch of coolant on it). I recently replaced the motherboard and PSU (while discovering FM2+ boards are still pretty expensive on eBay), and today the last component arrived: a Cryorig H5 Ultimate air cooler. I wouldn't mind replacing the TD03 with another water cooler (it was nice and quiet when it worked), but I got a good deal on the air cooler. Anyway, the Cryorig H5 Ultimate is a monster tower-style air cooler measuring 168.3 x 143 x 110.9 mm (H x W x L) with the included fan and weighing 920 grams (2.03 pounds).
I forgot to take an unboxing picture, but here is what it comes with (image from Cryorig's website).
The Cryorig H5 Ultimate is rated at 180W TDP and features 38 aluminum fins in an interesting hive / honeycomb design that allegedly reduces noise, improves airflow, and strengthens the fin stack. The fins are connected by four 6mm copper heat pipes to the nickel-plated C1100 copper baseplate. A 140mm XF140 fan (76 CFM) pushes air through the fin stack, spinning anywhere between 700 and 1300 RPM with rated noise levels of 19 to 23 dBA, respectively.
There are no LEDs on this monster, but it doesn't need them to look good, in my opinion. Fortunately, the fan height is adjustable, and you can mount the fan on either side of the heatsink. This is important because the fan can and will interfere with your RAM modules, depending on your motherboard and the height of your RAM heat spreaders! As you will see, I ran into this, but my PC chassis gave just enough clearance that I was able to move the fan up enough to clear the G.Skill RAM (which is on the shorter side). The fan is mounted using two wire clips and is fairly easy to remove and install.
Cryorig supports both AMD and Intel motherboards (including AM4 with a separate mounting upgrade kit): FM1, FM2, AM2, and AM3 on the AMD side and LGA 775, 1156, 1150, 1151, and 2011 on the Intel side. The cooler comes with separate AMD and Intel mounting kits, both of which require you to install a backplate.
In my case, I am installing the Cryorig H5 Ultimate on an FM2+ socket motherboard. I had to unscrew the default AMD mounting system and install Cryorig's backplate. Four screws thread onto the backplate posts with a slight bit of give, which is normal (the backplate will not be tightly screwed to the board; it should be able to move a bit). Then another bracket is screwed onto the backplate screws until hand tight (tighten them in an X pattern, going from one corner to the diagonally opposite corner).
Easy enough so far! However, here is where I ran into some trouble with the installation. Much like installing RAM for the first time, where it can feel like you need far more force than you should, the Cryorig cooler takes quite a bit of force to properly install. Learn from my frustration:
After applying your thermal paste, it's time to install the cooler. You will notice that there are two holes in the top of the cooler and two screw holes in the bracket you installed over the CPU socket. Line the cooler up so that the spring-mounted screws on the cooler sit over the holes in the bracket. I found it easiest to put my finger by one of the screws and make sure that screw was lined up, then let down the other side of the cooler so that both screws are lined up. Now you will need the special screwdriver Cryorig provides in the box. Using one hand, push down on the cooler, and use the other hand to guide the screwdriver into one of the holes. You will need to keep pressure on the cooler while turning the screw so that it can catch the threads in the bracket and start, well, screwing in. Make a few turns so that it is in, but do not fully tighten the screw down. Now move your hand to the opposite side of the cooler, where the other screw hole is, and press down. You will need to push this side of the cooler down with quite a bit of force (again, thinking back to the RAM example, don't hulk smash anything, but don't be too gentle either). While keeping pressure on this side to hold it toward the socket, start screwing down this side of the cooler. (If you did it right, the other side won't pop out; if you didn't screw the first side down enough, it might pop out and you'll have to start over, heh.) Once both sides are partially in, just alternate screwing the screws down until they are hand tight.
Trust me, you might think you are going to break this thing or bend something, but that is just standard procedure. Finally, plug the XF140 fan into the CPU_FAN header and you're good to go!
The Cryorig H5 Ultimate dwarfs my GA-F2A78M-HD2 mATX motherboard, leaving just enough room for two memory DIMMs and the graphics card! Heck, it still looked huge installed in the old A88X ATX board!
Since installing it, I have been playing around with the PC, trying to get some temperature readings for you, but I am discovering that getting accurate temperature readings from AMD processors (especially older APUs) is not that easy. I am still testing things out and looking into overclocking, but as best I can tell, the Cryorig cooler keeps the AMD A8 5600K processor somewhere around 55°C under load during AIDA64 stress testing. At idle the cooler is very quiet, and while it does ramp up under load, it is barely audible compared to the case exhaust fan! This is not a formal review, but so far it has been an interesting cooler, assuming you can find it at a good price.
If you are interested in a monster cooler like this, definitely double check your case and RAM clearances. The install was not too bad the second time around (I first installed it on her old motherboard, not knowing whether that board was dead, as I did not have another cooler to test with), but it is not as easy on this AMD FM2+ socket as Cryorig's video (and others I found on YouTube) makes it look on the Intel platform! Once you know that you can, and need to, use force to press the cooler down to get the screws started, it is a fairly quick install; I just wish that information was better spelled out in the instructions, as it would have saved me a ton of time the first go around! I don't have formal noise or temperature numbers yet, as I am just starting to test it, but so far I am happy with it.
Subject: General Tech | September 29, 2017 - 08:12 PM | Scott Michaud
Tagged: bethesda, fallout, pc gaming
Until tomorrow (Saturday, September 30th, 2017) at 11:59pm PDT, Steam and Bethesda are giving away the original Fallout game for free. Like EA’s On the House promotion through Origin, this is not just a free weekend. If you “install” the game from the Steam store (although you really only need to add it to your library -- you can actually install it whenever you like), then it is, apparently, yours to keep.
Honestly, whenever I get around to it, this will be my first time playing the game, too. Back in the 1997 timeframe, I was mostly playing games like Command & Conquer (including Red Alert). I never really got into RPGs, be they Western or Eastern. But, thanks to the wonders of PC gaming, I can just go back and, you know, play whatever I missed... because old games can be awesome, too.
Okay, I’m rambling. Add it to your Steam library if you haven’t already (and, of course, if you have a Steam account).
Subject: General Tech | September 29, 2017 - 06:19 PM | Tim Verry
Tagged: Z370, overclocking, msi, LGA 1151, eatx, e-atx, coffee lake
In response to a few questions readers have brought up about the NICs on the MSI Z370 Godlike Gaming: this board features the Killer xTend technology from Rivet Networks that we saw at Computex. The three Killer Ethernet ports and Killer WiFi allow you to use your PC as both a network switch and a WiFi extender. Several of GIGABYTE's AORUS Gaming motherboards will also feature this technology.
*******************Now back to your regularly scheduled PR******************
MSI is entering the Z370 motherboard fray with two flagship boards: the ATX MSI Z370 Gaming Pro Carbon AC and the E-ATX Z370 Godlike Gaming. The latter takes Z370 to the extreme with more power phases, cooling, expansion, and, of course, RGB LEDs!
The massive motherboard features a beefy digital power delivery system with solid aluminum heatsinks to keep it cool as well as show off RGB bling. MSI did not specify the number of phases or how they are divided up, but there appear to be as many as 18 power phases (in reality, likely fewer). Power inputs include both 8-pin and 4-pin EPS connections along with the standard 24-pin ATX connector and a 6-pin connector to supply extra power to the PCI-E slots. There are four steel-shielded DDR4 DIMM slots with dedicated digital PWM power delivery supporting up to 64 GB at 4133 MHz.
The Z370 Godlike Gaming further features four steel-reinforced PCI-E x16 slots, a single PCI-E x1 slot, and three M.2 (key M) slots (using the included PCI-E riser card, you can get two extra M.2 slots). On the traditional storage front, the motherboard has six SATA 6 Gbps ports and one U.2 port. RGB support comes in the form of MSI’s own “Mystic Light” technology, which includes onboard LEDs as well as a header for RGB strips (and MSI’s site shows the board comes with a Phanteks-branded RGB strip) that can be controlled with software. As far as cooling goes, there are headers for a CPU fan, a water pump, and eight system fans.
MSI is using a Killer 1535 chip for 802.11ac Wi-Fi (2x2) as well as three Killer E2500 Gigabit Ethernet NICs. Audio is handled by “MSI Audio Boost,” which pairs two EMI-shielded Realtek ALC1220-based audio processors with an ESS DAC and amplifier and gold-plated audio jacks (including a ¼” jack for high-end headphones). MSI claims the LED-bordered, isolated audio design includes separate PCB layers for the left and right audio channels and high-end WIMA and Nichicon capacitors.
Around back the MSI Z370 Godlike Gaming includes:
- 2 x Wi-Fi antenna connections
- 1 x PS/2
- 2 x USB 2.0
- 1 x USB 3.1 Gen 2 Type-C
- 1 x USB 3.1 Gen 2 Type-A
- 6 x USB 3.1 Gen 1 Type-A
- 3 x Gigabit Ethernet (Killer E2500)
- 7 x Audio
  - 5 x 3.5mm
  - 1 x 6.35mm
  - 1 x S/PDIF
Users can get additional USB 3.1 ports using internal headers powered by ASMedia ASM3142 and ASM1074 chipsets (Gen 2 and Gen 1 respectively).
Retail versions of the motherboard should come with a PCI-E riser card with two M.2 slots, a headphone adapter, custom sleeved SATA cables, three I/O backplates, three 2-pin temperature probes, an SLI bridge, and a 400mm LED strip.
I am interested in this board from an overclocking perspective, as the beefy power phases and the additional CPU power from the 8+4 pin connectors should allow for some extreme overclocking fun and enable higher everyday stable overclocks as well. This board has just about everything you could want from a high-end motherboard (except Intel NICs, 10 GbE, and Thunderbolt, but you can't have everything!), though it is sure to come at a hefty premium. Unfortunately, MSI is not yet talking pricing or availability.
In other Z370 news:
- ASUS Reveals Entire Z370 Lineup
- GIGABYTE Introduces Z370 AORUS Motherboards
- Gigabyte Teases Z370 AORUS Gaming 7 Motherboard
Subject: Storage | September 29, 2017 - 03:11 PM | Jeremy Hellstrom
Tagged: adata, SD700, portable storage, ssd
The SD700 comes in 256GB, 512GB, and 1TB capacities, and oddly enough the black models cost $5 less than the tent-caterpillar-gut coloured ones. The drive is USB 3.1 Gen 1, with transfer speeds hitting the expected rate in The Tech Report's testing; still a far cry from a Gen 2 drive, however. The pricing is attractive at only a tiny bit more than an internal drive of the same capacity, and there is no way an internal SSD would stand up to the abuse the SD700 shrugs off. If you are in need of large external storage that can be tossed into a bag and forgotten about until it is needed, then check this review out.
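For context on that Gen 1 versus Gen 2 gap: most of it falls straight out of line-rate arithmetic. Here is a quick back-of-the-envelope sketch (encoding overheads per the USB 3.x specs; real-world drives land well below these theoretical ceilings):

```python
# Rough theoretical throughput ceilings for USB 3.x links, accounting only
# for line encoding overhead -- real drives lose more to protocol overhead.

def usb_payload_mbps(line_rate_gbps: float, enc_payload: int, enc_total: int) -> float:
    """Usable decimal MB/s after line-encoding overhead."""
    payload_bits_per_second = line_rate_gbps * 1e9 * enc_payload / enc_total
    return payload_bits_per_second / 8 / 1e6

gen1 = usb_payload_mbps(5, 8, 10)      # USB 3.1 Gen 1: 5 Gbps, 8b/10b encoding
gen2 = usb_payload_mbps(10, 128, 132)  # USB 3.1 Gen 2: 10 Gbps, 128b/132b encoding

print(f"Gen 1 ceiling: {gen1:.0f} MB/s")  # 500 MB/s
print(f"Gen 2 ceiling: {gen2:.0f} MB/s")  # 1212 MB/s
```

So even before protocol overhead, a Gen 1 drive tops out around 500 MB/s, while a Gen 2 link has well over double that headroom.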
"Adata has put together an external version of its SU800 SSD. Clad in water- and shock-resistant rubber and plastic, the SD700 wants to be the portable drive of choice for the active data hoarder. Join us as we see how well it can handle our test suite."
Here are some more Storage reviews from around the web: