Subject: Displays | December 27, 2018 - 07:21 PM | Scott Michaud
Tagged: just delivered, Samsung, qled
Just Delivered / Just Picked Up is a series of posts where we talk about things we have recently purchased. Think of it like a mini-review for first impressions.
A major goal of my current upgrade cycle was to finally buy a high-end desktop monitor.
In the end, I decided to go for the 27-inch Samsung CHG70 QLED Gaming Monitor. It’s listed as HDR, although that is at a typical brightness of 350 cd/m2. It supports FreeSync 2, although I have NVIDIA graphics. And its native resolution is only 1440p, although that comes at 144 Hz, and I put a lot of value on a high refresh rate.
That arm is a bit... unnecessary.
Thankfully, it's entirely unnecessary if you wall mount.
A weird design decision is its stand – it’s way too deep. It has a bit of a crane shape, versus a vertical slider like my BenQ’s, so it eats about 13 inches of desk. That left barely enough room for my keyboard in front of the monitor, and my desk is 23 inches deep (plus an extra inch between it and the wall). If you can wall-mount it – which the monitor supports – that’s a complete non-issue, and the entire stand can be removed, which is nice.
In terms of color? It’s beautiful.
Of course, one of the first things I did was go onto YouTube and look at various videos with highly saturated colors and deep blacks. It looks a bit blown out in some bright scenes, almost like its gamma is off, although my calibration effort so far is limited to “set in Cinema mode”. I’ll need to play around with it someday.
Subject: General Tech | December 26, 2018 - 10:42 PM | Jeremy Hellstrom
Tagged: input, topre, REALFORCE, realforce r2, TKL
Fans of rare imported mechanical keyboards crafted by Topre have a friend in Fujitsu, which now imports them to North America. The secret sauce that attracts people to these keyboards is the electrostatic capacitive key switch Topre uses, which is very different from Cherry-style mechanical switches or discount rubber dome designs. That does mean you won't be able to swap keycaps without ordering them directly from Topre. Drop by The Tech Report to see if Topre deserves the glamour that surrounds their name.
"Topre's Realforce keyboards are legendary for their quality and durability, and now Fujitsu Computer Products of America is officially distributing these boards on American shores. We got a tenkeyless Realforce R2 board under our fingers to see whether it lives up to the Topre mythos."
Here is some more Tech News from around the web:
- Cherry KC 6000 Keyboard and MC 4900 Mouse @ Guru of 3D
- CORSAIR K70 RGB MK.2 Low Profile Mechanical Gaming Keyboard Review @ NikKTech
- HyperX's Alloy FPS RGB @ The Tech Report
- Fnatic Gear Flick 2 @ TechPowerUp
Subject: Processors | December 26, 2018 - 11:52 AM | Jeremy Hellstrom
Tagged: overclock, 200GE, amd, msi, b350, b450, AM4
If you happen to have an MSI B450 or B350 motherboard, get out there and grab the latest UEFI BIOS with its updated AGESA, as it may be pulled soon. The reason it may not last is that it lets you overclock your Athlon 200GE processor, something that is normally impossible to pull off. TechSpot tried it out successfully on a variety of MSI boards, such as the Gaming Pro Carbon AC, and managed to bump the $55 processor from 3.2GHz to 3.8GHz. You won't see a huge increase in performance, though you will see some, and it makes for an interesting experiment.
"In an unexpected turn of events, it's now possible to overclock the otherwise-locked $55 Athlon 200GE processor. In what appears to be a slip up by MSI, the component maker has enabled Athlon overclocking with their latest BIOS release across its entire AM4 motherboard lineup."
Here are some more Processor articles from around the web:
- AMD Athlon 200GE @ Guru of 3D
- Battlefield V Multiplayer CPU Benchmark: Ryzen 7 2700X vs. Core i9-9900K @ Techspot
- Intel Core i9-9980XE Extreme Edition Review – It Hertz! @ Kitguru
- Windows Server 2019 vs. Linux vs. FreeBSD Performance On A 2P EPYC Server @ Phoronix
Subject: General Tech | December 25, 2018 - 03:26 PM | Jeremy Hellstrom
There are many positive reasons to game on Christmas: killing time before you next baste the turkey (and yourself), showing off the newly unwrapped hardware you got, or holding a family deathmatch tourney to determine who gets stuck washing the dishes. Steam have fired up their Winter Sale, with their own special advent calendar you can check out even if you don't pick anything up.
If you are more the bah-humbug type, you should hang out with EA. Rumours stemming from a since-deleted post suggest they were only kidding about learning their lessons from Star Wars Battlefront II, as microtransactions may be about to arrive in Battlefield V. After all, who doesn't want to pay $50 of real money for 6,000 BCoins? On the other hand, they may be worth more than the other B-type coins in the near future.
Here is some more gaming news from around the web:
- OCC Reviews GRIS
- Just Cause 4's first big patch is "just the start of the work" planned to fix it @ Rock, Paper, SHOTGUN
- OCC Reviews Just Cause 4
- Final Fantasy XV DLSS versus TAA IQ and Performance Analysis at BabelTechReviews
- Wot I Think: Fallout 76 @ Rock, Paper, SHOTGUN
- Book of Demons @ TechPowerUp
- Steel Shadows @ TechPowerUp
- Ars Technica’s best games of 2018
Subject: Graphics Cards | December 24, 2018 - 03:19 PM | Jeremy Hellstrom
Tagged: sea hawk, RTX 2080, overclocking, msi
[H]ard|OCP takes a look at MSI's Sea Hawk RTX 2080, which sports a GPU covered by an AiO watercooler plus a blower fan to ensure the memory and VRM are actively cooled as well. The cooler's design also slims the card, so you don't need to worry about spacing between your PCIe slots as with some other coolers. Without any work whatsoever, you can expect an average 1954MHz GPU clock, 2040MHz with a bit of a power boost, or 2060MHz if you don't mind the noise produced by fans spinning at 100%. The VRMs did prove a little finicky, as you can see in the full review.
"MSI sent over its new Sea Hawk RTX 2080 card for use in a build video. This is a fair simple RTX card build that is purchased with a pre-installed All-In-One cooler. We wanted to see how well it overclocked and spent a night of gaming in order to do that and we have to say we were pleased with our results."
Here are some more Graphics Card articles from around the web:
- MSI GeForce RTX 2080 Duke 8G OC @ Modders-Inc
- ZOTAC GAMING GeForce RTX 2080 Ti AMP Extreme @ Guru of 3D
- ZOTAC RTX 2080 AMP Extreme Video Card Review @ Hardware Asylum
- ASUS GeForce RTX 2070 STRIX OC 8 GB @ TechPowerUp
- Zotac Gaming RTX 2070 OC Mini 8GB @ Kitguru
- KFA2 GeForce GTX 1060 6 GB GDDR5X @ TechPowerUp
- Initial Linux Benchmarks Of The NVIDIA TITAN RTX Graphics Card For Compute & Gaming @ Phoronix
- OCC NVIDIA RTX 2080 Overclocking Guide
- Battlefield V NVIDIA Ray Tracing RTX 2070 Performance @ [H]ard|OCP
- NVIDIA DLSS Test in Final Fantasy XV @ TechPowerUp
Subject: General Tech | December 24, 2018 - 12:35 PM | Jeremy Hellstrom
Tagged: harmony link, logitech, harmony hub
Logitech have come up with a compromise for Tiny Tim again this year, rolling back their Scrooge-like decision to drop all support for third-party hardware on their Harmony Hub. If you wish to continue using XMPP APIs to link in additional hardware, you can opt into a beta program which will enable support again, according to The Register. This seems a wise way to increase the security of the Harmony Hub by requiring an extra step before you can add in extra devices; those who do not enable the beta can be sure that only vetted IoT devices can connect to their network, while the technically inclined can continue to play with their smart home.
Let's hope next year we are not revisiting this once again.
"That solution is "an XMPP beta program, which will allow access to local controls." In effect, the company has written in a hasty workaround that the tech-savvy can tap to get their systems working."
Here is some more Tech News from around the web:
- 3D printing and genetic engineering bring biofilms to life @ Physics World
- AMD vs Intel: Team Red lead the way in chip innovation in 2018 @ The Inquirer
- NVIDIA 'GeForce NOW Recommended Routers' Program Helps Gamers Choose Networking Gear @ Slashdot
- Wi-Fi 6 Explained: The Next Generation of Wi-Fi @ Techspot
- Linux 4.20 Released in Time for Christmas @ Slashdot
- What Ever Happened to ICQ? @ Techspot
Subject: Processors | December 22, 2018 - 12:02 AM | Tim Verry
Tagged: Zen, ryzen, rx vega, athlon, APU, amd, 240GE, 220GE
Today AMD announced the availability of its budget Zen-based Athlon Processor with Vega Graphics APUs and released details about the Athlon 220GE and Athlon 240GE APUs that complement the Athlon 200GE it talked about back in September.
These Athlon 200-series processors are aimed at the budget and mainstream markets to fill the need for a basic processor for everyday tasks such as browsing the internet, checking email, and doing homework. The APUs utilize a 14nm manufacturing process and pair Zen CPU cores with a Vega-based GPU in a 35 watt power envelope, and are aimed at desktops utilizing the AM4 socket.
The Athlon 200GE, 220GE, and 240GE are all dual core, 4-thread processors with 4MB L3 cache and GPUs with 3 compute units (192 cores) clocked at 1 GHz. They all support dual channel DDR4 2667 MHz memory and have 35W TDPs. Where the Athlon APUs differ is in CPU clockspeeds with the higher numbered models having slightly higher base clock speeds.
| APU Model | Athlon 200GE | Athlon 220GE | Athlon 240GE |
| --- | --- | --- | --- |
| Cores/Threads | 2 / 4 | 2 / 4 | 2 / 4 |
| Base Freq | 3.2 GHz | 3.4 GHz | 3.5 GHz |
| Graphics Freq | 1 GHz | 1 GHz | 1 GHz |
The Athlon 200GE starts at 3.2 GHz for $54.98 with an additional $10 buying you the 3.4 GHz 220GE and another $10 premium buying the $74.98 Athlon 240GE's 3.5 GHz CPU clocks. The Athlon 220GE seems to be the best value in that respect, because the extra $10 buys you an extra 200 MHz and the jump to the 240GE only gets an extra 100 MHz for the same extra cost. (Keep in mind that these chips are not unlocked.) Then again, if you are on a tight budget where every dollar counts, the 200GE may be what you end up going with so that you can buy better RAM or more storage.
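To put that value argument in per-megahertz terms (simple arithmetic from the list prices above):

$$\frac{\$10}{200\ \text{MHz}} = \$0.05\ \text{per MHz (200GE} \rightarrow \text{220GE)}, \qquad \frac{\$10}{100\ \text{MHz}} = \$0.10\ \text{per MHz (220GE} \rightarrow \text{240GE)}$$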
The new chips are available now, but it seems retailers aren't quite ready with their listings: while the 200GE is up for sale at Amazon, the 220GE and 240GE are not yet listed online at the time of writing.
The Athlon 200GE-series APUs introduce a new lower-end option that sits below Ryzen 3 at a lower price point for basic desktops doing typical office or home entertainment duties. With a 35W TDP they might also be useful in fanless home theater PCs and game streaming endpoints for gaming on the big screen.
I am also curious whether these chips will be used by the DIY and enthusiast community as the base for budget (gaming) builds, and whether they might see the same popularity as the Athlon X4 860K (note: no built-in graphics). I would be interested in a comparison between the 4c/4t 860K ($57) and the 2c/4t 200GE ($55) to see how they stack up with the newer process node and core design. On the other hand, enthusiasts may well be better served by the overclockable Ryzen 3 2200G ($97) if they want a budget Zen-based part that also has its own GPU.
What are your thoughts on the new Athlon APUs?
Subject: Mobile | December 21, 2018 - 10:45 PM | Tim Verry
Tagged: windows hello, windows, Samsung, s pen, notebook 9 pen, convertible tablet, convertible, 2-in-1
Samsung is updating its laptop lineup to include the new Notebook 9 Pen, a 2-in-1 convertible with a built-in S Pen that comes in 13.3-inch and 15-inch form factors. Featuring full-body aluminum frames, diamond-cut edges, and narrow display bezels, the Notebook 9 Pen weighs in at 2.47 pounds and 3.44 pounds for the 13-inch and 15-inch models respectively. The new “Notebook 9 Pen” PCs should not be confused with the existing Notebook 9 Pen notebooks released earlier this year. The new models, which are slated for a 2019 release, introduce a 15” model to the lineup as well as more memory, brighter (500-nit) displays with narrower bezels, and two new colors and designs.
Available in Ocean Blue or Platinum White, the Notebook 9 Pen includes a full HD display with very small bezels and an HD webcam, paired with a backlit keyboard and a decently sized trackpad, joined by a 360-degree rotating hinge. The convertible laptop also has dual 5W AKG speakers with ThunderAmp technology. External I/O includes two Thunderbolt 3 ports, one USB-C port, one combo headphone/microphone jack, and one UFS/microSD slot. As far as wireless connectivity, the notebook supports 802.11ac Wave 2 2x2 WiFi.
The modern I/O is supported by modern internal hardware, including up to 8th Generation Intel Core i7 processors, 16GB of LPDDR3, and a 512GB PCI-E NVMe SSD. The Notebook 9 Pen with the 13.3” display uses Intel UHD graphics, but the 15” model can be equipped with an NVIDIA MX150 GPU with 2GB of memory. Both models are powered by a 54 Wh battery that supports fast charging and allegedly offers up to 15 hours of battery life.
Of course, the interesting aspect of the Notebook 9 Pen is the S Pen, which Samsung has reportedly improved to be more responsive, with up to a 2x reduction in latency down to 7ms. The S Pen comes with three different pen tips so that artists can get the feel they want when drawing on the screen. The S Pen can do the usual things its smartphone counterparts can, like drawing and writing, and it can also be used to control media playback, advance slides, and record voice notes with its built-in microphone.
First impressions look promising, but pricing is going to be key, as are build quality and feel. With this year’s model starting at $1,400 MSRP ($1,000+ on Amazon for the 8GB RAM version), the updated 2019 Notebook 9 Pen isn’t going to be cheap! Unfortunately, exact pricing and availability have not yet been disclosed.
With that said, assuming reviews hold up, it looks sharp, and for artists and designers who like to work on the go it may be worth checking out!
Subject: Graphics Cards | December 21, 2018 - 02:00 PM | Jim Tanous
Tagged: physx 4.0, PhysX, open source, nvidia
As promised in the company's initial announcement earlier this month, NVIDIA has released the newly open-sourced PhysX 4.0 SDK via GitHub. Now, thanks to its 3-Clause BSD license, any game developer, hardware company, or coding enthusiast can grab the latest version of NVIDIA's realtime physics engine and tinker, improve, or implement it in hopefully creative new ways.
The one limitation, of course, is that in its current form PhysX 4.0 (and version 3.4, which is now open source, too) still references lots of NVIDIA's closed source APIs, notably CUDA. But with the PhysX framework now available to fork, there's nothing to stop an eager company or programmer from creating and implementing their own alternatives to NVIDIA's proprietary tech.
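If you're planning to grab the SDK and tinker, the basic setup flow is short. Below is a minimal C++ sketch of the usual PhysX "hello world" boilerplate – create a foundation, a physics object, and a scene, then step the simulation – based on the SDK's documented setup calls rather than anything from the article, with error checking omitted:

```cpp
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // The foundation and physics objects are the SDK's entry points.
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // A scene holds the simulated actors; the default CPU dispatcher runs the solver on worker threads.
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(2);
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = dispatcher;
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Drop a dynamic box onto a static ground plane.
    PxMaterial*     material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    PxRigidStatic*  ground   = PxCreatePlane(*physics, PxPlane(0.0f, 1.0f, 0.0f, 0.0f), *material);
    PxRigidDynamic* box      = PxCreateDynamic(*physics, PxTransform(PxVec3(0.0f, 10.0f, 0.0f)),
                                               PxBoxGeometry(1.0f, 1.0f, 1.0f), *material, 10.0f);
    scene->addActor(*ground);
    scene->addActor(*box);

    // Step the simulation for two seconds at 60 Hz.
    for (int i = 0; i < 120; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true); // block until the step completes
    }

    // Release objects in roughly the reverse order of creation.
    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}
```

Note that a sketch like this exercises only the CPU solver; the GPU-accelerated paths are the parts that still lean on NVIDIA's closed-source CUDA libraries mentioned above.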
In addition to going open source, PhysX 4.0 introduces a number of new features as outlined on NVIDIA's developer site:
- Temporal Gauss-Seidel Solver (TGS), which makes machinery, characters/ragdolls, and anything else that is jointed or articulated much more robust. TGS dynamically re-computes constraints with each iteration, based on bodies’ relative motion.
- The new reduced coordinate articulations feature makes the simulation of joints possible with no relative position error and realistic actuation.
- New automatic multi-broad phase.
- Increased scalability with new filtering rules for kinematics and statics.
- Actor-centric scene queries significantly improve performance for actors with many shapes.
- Build system now based on CMake.
BSD 3 licensed platforms:
- Apple iOS
- Apple MacOS
- Google Android ARM
- Microsoft Windows
Unchanged NVIDIA EULA platforms:
- Microsoft XBox One
- Sony Playstation 4
- Nintendo Switch
Subject: Motherboards | December 20, 2018 - 02:58 PM | Jeremy Hellstrom
Tagged: x299, prime x299-deluxe II, Intel, i9-9980XE, asus
ASUS have refined their Prime X299-Deluxe with a second version, which includes a redesign of the VRMs and power phases that led to much discussion about the original board. The Tech Report noticed that the change is not as simple as the marketing material would lead you to believe, and they go into great detail about it at the beginning of their review. In the end, the changes did result in decent VRM thermals when overclocking and no discernible stability issues. That's not all that is new about the board; the rest is mostly good, with the exception of the UEFI enabling Windows Platform Binary Tables by default to load ASUS' software hub.
"Motherboard manufacturers are introducing refined X299 boards in the wake of Intel's Basin Falls platform refresh, and Asus' Prime X299-Deluxe II promises to fix some of the teething pains of its predecessor. We paired this board up with the Core i9-9980XE to see whether it's up to the job of hosting Intel's highest-end desktop chip yet."
Here are some more Motherboard articles from around the web:
Subject: General Tech | December 20, 2018 - 02:41 PM | Jeremy Hellstrom
Tagged: harmony link, logitech, scrooge, harmony hub
It would seem it is time once again for Logitech's Christmas tradition of bricking users' Harmony devices, or at least devices connected to them. Last November they got into the spirit by announcing that they had no intention of renewing an SSL/TLS cert which was required for their Harmony Link device to connect to and control other smart home devices. Instead they offered current users 35% off the purchase of the newly released Harmony Hub ... which did not go over well. It quickly escalated to the point where Logitech was censoring the words "class action lawsuit" in its support forums; to prevent spam, of course.
Logitech seemed to have learned their lesson, as a few short hours after the story broke they offered a free Harmony Hub to anyone who owned a Link. Those who took Logitech up on that offer have found themselves fooled once again: a new firmware update has shut off all access to third-party hardware that was previously connected through XMPP-based APIs. Anyone who incorporated such devices into their Harmony Hub-controlled system will now find them unable to connect; one hopes your Christmas displays did not depend on that. The Register has a few choice comments to add here.
"Logitech recently released a firmware update for Harmony hub-based remotes that addressed some security vulnerabilities brought to our attention by a third-party cyber security firm,"
Here is some more Tech News from around the web:
- Chip flinger Micron reels in production, expenses as revenue growth comes to crashing halt @ The Register
- On the first day of Christmas, Microsoft gave to me... an emergency out-of-band security patch for IE @ The Register
- ASUS RT-AC86U Dual Band Gigabit Router @ OCC
- Graphics Card Pricing Update December 2018: Pascal is Running Out of Stock, Radeon Dominates Value Offerings @ Techspot
- Pass the ruddy eggnog! It's the 2018 INQ Christmas gift guide
Subject: Storage | December 20, 2018 - 10:34 AM | Sebastian Peak
Tagged: storage, ram, Optane DC Persistent Memory, Optane, micron, memory, Intel, Hynix, flash, ddr4, 3D XPoint
ServeTheHome got up close and personal with Optane DC Persistent Memory in an article posted yesterday, removing the heat spreaders and taking a look at (and several photos of) the components within.
Intel Optane Persistent Memory DDR4 module, front view (via ServeTheHome)
"We are going to take a 128GB Intel Optane Persistent Memory DDR4 module, and open it up. Until now, Intel Optane DC Persistent Memory has mostly been photographed with its big black heat spreader. We ended up with a handful of modules not from Intel, nor a system provider, but a handful to use."
Among their notes we have this interesting find, as SK.Hynix is the provider of the module's DRAM, rather than Micron:
"On the other side of the module from the Optane controller is a DDR4 DRAM module, this one from SK.Hynix. Model number H5AN4G8NAFR-TFC. We are not sure why Intel would not use a Micron module here since Micron has been the manufacturing partner for 3D XPoint thus far."
Intel Optane Persistent Memory DDR4 module, rear view (via ServeTheHome)
The full article is available here from STH and includes an embed of this video covering their de-lidding and chip exploration process:
Subject: General Tech | December 20, 2018 - 05:01 AM | Jim Tanous
Tagged: SFX PSU, scythe, Samsung, ryzen 3, RTX 2060, podcast, ghost canyon, enterprise ssd, crucial, Corsair PSU
PC Perspective Podcast #526 - 12/19/2018
Our podcast this week looks at some new enterprise SSDs from Samsung, a quiet and capable CPU air cooler from Scythe, rumors of new mobile GPUs from NVIDIA and AMD, and more!
Subscribe to the PC Perspective Podcast
Check out previous podcast episodes: http://pcper.com/podcast
00:06:27 - Review: Samsung Enterprise SSD
00:28:25 - Review: Scythe Ninja 5 CPU Cooler
00:39:01 - Review: Corsair Platinum SFX PSU
00:43:46 - Review: Crucial P1 SSD
00:55:58 - Rumor: NVIDIA RTX Mobile
00:58:01 - Rumor: NVIDIA RTX 2060
01:01:20 - Rumor: AMD Ryzen 3000 Mobile
01:04:41 - News: AMD Radeon Adrenalin 2019 Edition
01:13:15 - Rumor: Intel Ghost Canyon X NUC
01:16:24 - News: Gigabyte AORUS Xtreme WaterForce Motherboard
01:21:17 - News: JEDEC HBM Standard Updates
01:24:25 - News: Windows Sandbox
Subject: Processors | December 19, 2018 - 08:47 PM | Tim Verry
Tagged: Zen+, ryzen mobile, ryzen, rumor, picasso, geekbench, amd
Twitter user APISAK is at it again with more hardware leaks, and this time the rumors surround AMD's next-generation mobile 3000U-series "Picasso" APUs, which will replace Raven Ridge in 2019. As reported by Hexus, the new APUs were reportedly spotted by APISAK (@TUM_APISAK on Twitter) in two HP laptops in 14" and 17" form factors, and they offer power efficiency and performance improvements over Raven Ridge's CPU cores along with Vega-based graphics. Searching around online and parsing the various conflicting rumors and speculation on Picasso, I think it is most likely that Picasso is 12nm and utilizes Zen+ CPU cores, though it remains to be seen how true that is.
Based on previous roadmaps, AMD's APUs have trailed the desktop CPUs in process technology and architecture, opting to refine the previous generation for mobile rather than operating at the bleeding edge. So while 2019 will see Zen 2 architecture-based CPUs and GPUs built on 7nm, APUs in 2019 are likely to stick with 12nm and Zen+ tuned for a mobile power envelope, with tweaks to SenseMI and technology like mobile XFR and dynamic power delivery.
In any event, Picasso APUs are rumored to include the Ryzen 3 3200U, Ryzen 3 3300U, and Ryzen 5 3500U based on Geekbench results pages, as well as the low-end [Athlon?] 3000U and the high-end Ryzen 5 3700U, according to the source. The 3000U and 3700U are known in name only, but the middle-tier APUs have a bit more information available thanks to Geekbench. The Ryzen 3 3200U is a dual core (four thread) part, while the Ryzen 3 3300U is a quad core (four thread) part and the Ryzen 5 3500U is a quad core (eight thread) CPU. All Picasso APUs are rumored to use Vega-based graphics. The dual core APU has the highest base clock at 2.6 GHz, while the 3300U and 3500U start at 2.1 GHz. The Ryzen 5 3700U allegedly clocks from 2.2 GHz to 3.8 GHz and likely has the highest boost clock of the bunch. The parts use the FP5 mobile socket.
|  | Athlon(?) 3000U | Ryzen 3 3200U | Ryzen 3 3300U | Ryzen 5 3500U | Ryzen 5 3700U | A10-8700P (Carrizo) | Intel Core i5-8359U |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Cores / Threads | ? | 2 / 4 | 4 / 4 | 4 / 8 | 4 / 8 | 2 / 4 | 4 / 8 |
| Base / Boost Clocks | ? | 2.6 / ? GHz | 2.1 / ? GHz | 2.1 / ? GHz | 2.2 / 3.8 GHz | 1.8 / 3.19 GHz | 1.9 / 3.59 GHz |
| Cache | ? | 4 MB | 4 MB | 4 MB | 4 MB | 2 MB | 6 MB |
| Graphics | Vega | Vega 3 6 CU (920 MHz) | Vega 6 6 CU (1.2 GHz) | Vega 8 8 CU (1.2 GHz) | Vega | R6 6 CUs (GCN 1.2) | UHD 620 24 EUs (1.1 GHz) |
| Geekbench Single Core | ? | 3467 | 3654 | 3870 | ? | 2113 | 4215 |
| Geekbench Multi Core | ? | 6735 | 9686 | 11284 | ? | 4328 | 12768 |
Looking at the Geekbench results (which you should take with a grain of salt and as just an approximation, because final scores would depend on the platform, cooling, and how the chip ends up clocking within its power envelope), it seems that AMD may have a decent chip on its hands, one that improves performance a bit over Raven Ridge and significantly over its older Excavator-based pre-Zen designs. A cursory comparison with Kaby Lake shows that AMD is not quite on par in CPU performance (particularly per core, though it comes close in multi-core) but offers notably better compute / GPU performance thanks to the Vega graphics. It seems that AMD is closing the gap at least with Zen+.
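For a rough sense of scale, here are the leaked scores expressed as a fraction of the Core i5-8359U results from the table above (approximate, and subject to all the same caveats):

$$\text{Ryzen 5 3500U: } \tfrac{3870}{4215} \approx 0.92\ \text{(single-core)},\ \tfrac{11284}{12768} \approx 0.88\ \text{(multi-core)}; \qquad \text{Ryzen 3 3200U: } \tfrac{3467}{4215} \approx 0.82\ \text{(single-core)}$$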
I am remaining skeptical but optimistic about AMD's Picasso APUs, and I am looking forward to more information on the new chips and the devices that will use them. I am hoping that my educated guess about Picasso being 12nm Zen+ (or better) is correct, as the rumors mainly describe Picasso as a Raven Ridge successor offering power and performance tweaks without going into further detail. I expect more information on Picasso (APU) and Matisse (CPU) to come out as soon as next month at CES 2019.
What are your thoughts on Picasso?
Subject: General Tech | December 19, 2018 - 06:11 PM | Jeremy Hellstrom
Tagged: gaming, memory
TechSpot took a brief look at a wide variety of modern games to see just how much RAM they make use of. With benchmarks run on a system with 8GB, 16GB and then 32GB they give you insight into just how much RAM is enough to handle these games. With the price of memory still high, it is worth considering if it makes more sense to purchase just enough RAM for this generation of games and upgrade as the cost of DIMMs slowly declines. Take a peek to see how much memory your favourite titles make use of.
"Today we're looking into how much RAM you need to play the latest and greatest gaming titles. About this time each year we set on a memory capacity quest and last year's expedition lead us to conclude that for gamers 4GB is out, 8GB was the minimum, 16GB is the sweet spot and 32GB is overkill."
Here is some more Tech News from around the web:
- It's beginning to look a lot like multi-threaded CPUs, everywhere you go... Arm teases SMT Cortex-A65AE car brains @ The Register
- Topological off-on switch could make new type of transistor @ Physics World
- Windows 10 October Update is now finally available to 'advanced users' @ The Inquirer
- Synology MR2200ac Mesh Router @ TechPowerUp
- Enter for a chance to win consoles, smartwatches, and more in the 2018 Ars Charity Drive
Subject: General Tech | December 19, 2018 - 12:00 PM | Sebastian Peak
Tagged: windows insider, Windows 10 Pro, Windows 10 Enterprise, windows 10, windows, VM, virtual machine, microsoft, build 18305
Windows Sandbox is a new virtual machine environment coming to Windows 10 Pro and Enterprise versions in 2019, which will be available as an optional component within Windows. Microsoft details the upcoming feature in a blog post published yesterday, describing it as "a new lightweight desktop environment tailored for safely running applications in isolation".
"How many times have you downloaded an executable file, but were afraid to run it? Have you ever been in a situation which required a clean installation of Windows, but didn’t want to set up a virtual machine?
At Microsoft we regularly encounter these situations, so we developed Windows Sandbox: an isolated, temporary, desktop environment where you can run untrusted software without the fear of lasting impact to your PC. Any software installed in Windows Sandbox stays only in the sandbox and cannot affect your host. Once Windows Sandbox is closed, all the software with all its files and state are permanently deleted."
Microsoft lists these features for Windows Sandbox, outlining the secure and non-persistent "disposable" nature of the environment:
- Part of Windows – everything required for this feature ships with Windows 10 Pro and Enterprise. No need to download a VHD!
- Pristine – every time Windows Sandbox runs, it’s as clean as a brand-new installation of Windows
- Disposable – nothing persists on the device; everything is discarded after you close the application
- Secure – uses hardware-based virtualization for kernel isolation, which relies on Microsoft’s hypervisor to run a separate kernel that isolates Windows Sandbox from the host
- Efficient – uses integrated kernel scheduler, smart memory management, and virtual GPU
The environment requires a system with an AMD64 architecture running Windows 10 Pro or Enterprise build 18305 or later, with the rather slim minimum requirements of just 4GB of memory, 2 CPU cores, and 1GB of free disk space (with 8GB RAM, 4 cores, and SSD storage recommended).
The full blog post goes into further detail with a full "under the hood" look at Windows Sandbox, which among other things offers graphics hardware acceleration "with Windows dynamically allocating graphics resources where they are needed across the host and guest".
As to availability, ZDNet's Mary Jo Foley had reported that while the feature was originally "expected to come to Windows 10 19H1 early next year," it could be available to Insider testers as early as this week with Build 18301 of Windows 10 - but that build and the earlier 18292 build referenced in Foley's post have apparently been removed from the Microsoft blog post, which now exclusively lists Build 18305.
Subject: Editorial | December 19, 2018 - 05:52 AM | Jim Tanous
Tagged: extra life, charity
The PC Perspective team is joining up with Extra Life this holiday season to raise funds in support of our local children's hospitals, and we want you to join us!
For those unfamiliar, Extra Life is a gaming-focused fundraising event in support of Children's Miracle Network hospitals throughout the US and Canada. Participants set a fundraising goal which culminates in a 24-hour gaming marathon, with all proceeds going to support the treatment and care of millions of hospitalized kids each year.
So before the year comes to a close, head on over to our Extra Life team page and make a tax-deductible donation to help us reach our goal of $10,000. Making a donation in someone else's name can also be a great holiday gift, and allows you to avoid the retail store rush for those last-minute shopping trips! And unlike George Costanza, you'll actually be supporting an amazing and worthwhile cause!
We don't yet have a date scheduled for the 24-hour gaming marathon, but we'll let you know as soon as we have that locked down. And, on that same topic, please leave a comment or send us a tweet with some game recommendations that you'd like to see us play during the marathon.
Thank you all so much for your support. Let's smash our goal and together do something great to end the year!
Subject: Graphics Cards, Memory | December 17, 2018 - 04:33 PM | Sebastian Peak
Tagged: Vega, radeon, JESD235, jedec, high bandwidth memory, hbm, DRAM, amd
In a press release today JEDEC has announced an update to the HBM standard, with potential implications for graphics cards utilizing the technology (such as an AMD Radeon Vega 64 successor, perhaps?).
"This update extends the per pin bandwidth to 2.4 Gbps, adds a new footprint option to accommodate the 16 Gb-layer and 12-high configurations for higher density components, and updates the MISR polynomial options for these new configurations."
Original HBM graphic via AMD
The revised spec brings the JEDEC standard up to the level we saw with Samsung's "Aquabolt" HBM2 and its 307.2 GB/s per-stack bandwidth, but with 12-high TSV stacks (up from 8) which raises memory capacity from 8GB to a whopping 24GB per stack.
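For context, those headline numbers fall straight out of the raw spec figures – 2.4 Gbps per pin across the 1024-bit interface, and 16 Gb layers stacked 12-high:

$$2.4\ \text{Gb/s per pin} \times 1024\ \text{pins} = 2457.6\ \text{Gb/s} = 307.2\ \text{GB/s per stack}$$

$$16\ \text{Gb per layer} \times 12\ \text{layers} = 192\ \text{Gb} = 24\ \text{GB per stack}$$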
The full press release from JEDEC follows:
ARLINGTON, Va., USA – DECEMBER 17, 2018 – JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, today announced the publication of an update to JESD235 High Bandwidth Memory (HBM) DRAM standard. HBM DRAM is used in Graphics, High Performance Computing, Server, Networking and Client applications where peak bandwidth, bandwidth per watt, and capacity per area are valued metrics to a solution’s success in the market. The standard was developed and updated with support from leading GPU and CPU developers to extend the system bandwidth growth curve beyond levels supported by traditional discrete packaged memory. JESD235B is available for download from the JEDEC website.
JEDEC standard JESD235B for HBM leverages Wide I/O and TSV technologies to support densities up to 24 GB per device at speeds up to 307 GB/s. This bandwidth is delivered across a 1024-bit wide device interface that is divided into 8 independent channels on each DRAM stack. The standard can support 2-high, 4-high, 8-high, and 12-high TSV stacks of DRAM at full bandwidth to allow systems flexibility on capacity requirements from 1 GB – 24 GB per stack.
This update extends the per pin bandwidth to 2.4 Gbps, adds a new footprint option to accommodate the 16 Gb-layer and 12-high configurations for higher density components, and updates the MISR polynomial options for these new configurations. Additional clarifications are provided throughout the document to address test features and compatibility across generations of HBM components.
Subject: General Tech | December 17, 2018 - 04:28 PM | Jeremy Hellstrom
Tagged: audio, hyperx, cloud mix, bluetooth headset
HyperX have avoided Bluetooth headsets, citing the latency inherent in the connection, which obviously hasn't hurt them as they have been #1 in the US for a few years now. The Cloud Mix changes this, as it is Bluetooth capable as well as offering 3.5mm connectivity. The specifications look right: 40mm neodymium drivers and a reported frequency range of 10Hz – 40,000Hz.
Are they worth a try? Find out over at Legit Reviews.
"Back in October 2018, HyperX released the Bluetooth-enabled Cloud Mix gaming headset. The Cloud Mix came as a bit of a surprise to me. I’ve talked to HyperX many times over the years to inquire about Bluetooth wireless gaming headsets and was always told that the latency of the audio broadcast over Bluetooth introduced too much latency to be considered a good gaming headset. Yet, here was HyperX introducing the Cloud Mix... Read on if you still have an attention span!"
Here is some more Tech News from around the web:
- Edifier W860NB Active Noise Cancelling Headphones @ TechARP
- 1MORE Triple Driver BT In-Ear Headphones Review @ NikKTech
- Edifier S2000 Pro 2.0 Speaker @ Kitguru
Subject: Graphics Cards | December 17, 2018 - 03:24 PM | Sebastian Peak
Tagged: rumor, report, nvidia, leak, GTX 2060, graphics, gpu, geforce, gaming
We've been hearing rumors about a GeForce RTX 2060 since at least August, with screen captures of a reported mid-range Turing card (then assumed to be a "GTX" 2060) - seemingly with GTX 1080 levels of performance - surfacing at that time.
Then in November there was the reported Final Fantasy XV benchmark leak, showing performance a little below a GTX 1070 with the game running at 3840x2160 (high quality preset) - though this was possibly the mobile SKU, according to leaker APISAK on Twitter.
A week or so ago we saw an image of a Gigabyte card from VideoCardz.com which the site said was the RTX 2060:
Image via VideoCardz.com
"Our sources at Gigabyte have confirmed GeForce RTX 2060 graphics card launching soon. The card features TU106 GPU with 1920 CUDA cores and 6GB of GDDR6 memory. The model pictured below is factory-overclocked, but the exact clock remains unconfirmed." (Source: VideoCardz)
It seems fair to assume that a launch is imminent, with reports of a potential announcement the second week of January which may or may not coincide with CES 2019. As to final specs and pricing? Let the speculation commence!