Subject: General Tech | November 29, 2016 - 06:23 PM | Jeremy Hellstrom
Tagged: megaprocessor, DIY, neat
The Megaprocessor is a working CPU blown up in size so that you can walk inside it and watch data being physically processed with your own eyes. There are 8,500 LEDs in the core and another 2,048 for the memory, which light up as data passes through the core's 15,300 transistors and the memory's 27,000; that total count includes the transistors which control the LEDs. The core's clock is a staggering 25kHz and there are 256 bytes each of RAM and ROM. The site actually provides you with the assembly language to write code for the processor if you are interested, and you can visit the Centre for Computing History in Cambridge, England to see it in person. Drop by The Register for a quick look and for links to the project page, which has more details on the computer and build process, including a murderous vacuum cleaner.
"His ultimate goal other than the pure satisfaction of building the thing and getting it running, as El Reg reported in June this year, was to show the public how computers work by blowing the CPU up to a human-viewable scale."
Here is some more Tech News from around the web:
- Diamond Batteries That Last For Millennia @ Hack a Day
- Surface Studio torn down: Surprisingly upgradable storage @ Ars Technica
- 'DroneGun' Can Take Down Aircraft From Over 1.2 Miles Away @ Slashdot
- Neutralizing Intel’s Management Engine @ Hack a Day
- Snoopers' Charter: 'Draconian' IP Bill receives royal assent to become law @ The Inquirer
- Current switches insulator’s magnetic state @ Nanotechweb
- Storage newbie: You need COTS to really rock that NVMe baby @ The Register
- The Pokeball Power Bank @ Tech ARP
Subject: Processors | November 29, 2016 - 02:26 AM | Scott Michaud
Tagged: amd, Zen, Summit Ridge
Guru3D got hold of a product list, which includes entries for AMD’s upcoming Zen architecture.
Four SKUs are thus rumored to exist:
- Zen SR3: (65W, quad-core, eight threads, ~$150 USD)
- Zen SR5: (95W, hexa-core, twelve threads, ~$250 USD)
- Zen SR7: (95W, octo-core, sixteen threads, ~$350 USD)
- Special Zen SR7: (95W, octo-core, sixteen threads, ~$500 USD)
The sheet also states that none of these are supposed to contain integrated graphics, like we see on the current FX line. There is some merit to using integrated GPUs for specific tasks, like processing video while the main GPU is busy, or doing a rapid, massively parallel calculation without the latency of memory copies, but AMD is probably right not to spend resources, such as TDP budget, fighting the current lack of compatible software and viable use cases for these SKUs.
Image Credit: Guru3D
The sheet also contains benchmarks for Cinebench R15. While pre-rendered video is a task that really should be done on GPUs at this point, especially with permissive, strong, open-source projects like Cycles, they do provide a good example of multi-core performance that scales. In this one test, the Summit Ridge 7 CPU ($350) roughly matches the Intel Core i7-6850K ($600), again, according to this one unconfirmed benchmark. It doesn’t list clock rates, but other rumors claim that the top-end chip will be around 3.2 GHz base, 3.5 GHz boost at stock, with manual overclocks exceeding 4 GHz.
These performance figures suggest that Zen will not beat Skylake on single-threaded performance, but it might be close. That might not matter, however. CPUs, these days, are kind-of converging around a certain level of per-thread performance, and are differentiating on core count, price, and features. Unfortunately, there don't seem to have been many leaks regarding enthusiast-level chipsets for Zen, so we don't know if there will be compelling use cases yet.
Zen is expected early in 2017.
Subject: Graphics Cards | November 29, 2016 - 12:34 AM | Scott Michaud
Tagged: amd, graphics drivers
For tomorrow’s Watch_Dogs 2, AMD has released Radeon Software Crimson Edition 16.11.5 graphics drivers, giving users a day to configure their PCs. Note that, while the download links in the release notes say 16.11.4, hovering your mouse over them shows the correct version, dated last Friday. Don’t worry, though, the Radeon Technologies Group is based out of Markham, Ontario, Canada, so they didn’t miss out on turkey leftovers to bring you this software.
Okay, yes, that joke was lame. Moving on.
Beyond Watch_Dogs 2, this driver release also adds a new CrossFire profile for Dishonored 2 for Windows 8.x and Windows 10, so multiple GPU users of that game might want to upgrade, too. Beyond that, flickering in The Division and Battlefield 1 while using CrossFire is also addressed.
There are quite a few known issues, though, including a few crashes when using the Vulkan API. Most of these known issues were present in 16.11.4 from a couple of weeks ago, including the aforementioned Vulkan crashes, but this driver adds two. The CrossFire profile for Dishonored 2 that was added with this driver will be disabled on Windows 7, although it sounds like that will be fixed in a future release. Also, Watch_Dogs 2 may flicker or crash when using CrossFire with two RX 480s, but apparently not in other configurations.
The driver is not signed by WHQL, but I think I prefer what AMD’s doing now, rapidly releasing several drivers a month, addressing issues as they arise, versus a Microsoft stamp of approval. All that matters is that they can be installed on Anniversary Edition clean installs with Secure Boot enabled, and they can.
Subject: Graphics Cards | November 28, 2016 - 11:20 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Because holiday shopping is... wrapping up... this year’s rush of AAA games will be slowing down soon, at least until it starts up again in January. One of the last releases, Watch_Dogs 2, will be arriving on the PC tomorrow. As such, NVIDIA has released GeForce 376.09 drivers out to their website and GeForce Experience. The driver also includes optimizations for Dead Rising 4 and Steep.
Unfortunately, the release notes aren't available as of this writing (though the link is live). As such, we don't know the specifics of what the driver fixes or changes. The notes are supposed to be up sometime today. Users in the forums have been complaining about a few things here and there, but nothing credible and widespread enough to be attributed to the driver.
Subject: Cases and Cooling | November 28, 2016 - 07:18 PM | Jeremy Hellstrom
Tagged: modular psu, in win, in win classic, 750w, 80 Plus Platinum
In Win is a venerable supplier of PSUs that slowly faded into the background as the PSU market grew to include numerous manufacturers and resellers. They are looking to get back into the minds of shoppers with their new Classic series, sporting a fully modular design, separated 12V rails, aluminum exteriors, and an 80 Plus Platinum rating. Inside the unit you will find Nippon Chemi-Con capacitors and a clean design which impressed [H]ard|OCP. Looks are not the only thing that matters when choosing a PSU, however, so check out the full review to see how well the In Win Classic 750W performed.
"It has been years since we have reviewed a computer power supply from In Win. You might remember the In Win name from being a prolific case supplier back in the early enthusiast days of the 1990's. How does In Win stack up in 2016 with its Classic series PSU that has a very sleek look to it and nice feature set?"
Here are some more Cases & Cooling reviews from around the web:
- Thermaltake Smart DPS G 750W Power Supply Unit Review @ NikKTech
- Super Flower Platinum King 650W @ Kitguru
- Enermax Platimax DF 600W @ [H]ard|OCP
- Thermaltake Toughpower SFX 600W @ [H]ard|OCP
Subject: General Tech | November 28, 2016 - 06:44 PM | Jeremy Hellstrom
Tagged: Raspberry Pi 3, plex, pandora, Netflix
**This is your own personal Netflix, seeing as you are no longer able to access Netflix on "unofficial" devices. Check the comments for great info.**
Over at Linux.com you can find instructions on making a very inexpensive headless Plex Media Server. You will need a working PC to start the installation by formatting an SD card and setting it up with NOOBS. After a little configuration work on the Pi, linking it to your locally stored video libraries and online content such as CNN and Netflix, you have a media centre ready for use for well under $100. Maybe you could consider making one as a gift for someone deserving. The full instructions and parts list can be found here.
"No, you don’t have to buy an expensive, bulky PC. All you need is a Raspberry Pi 3, a hard drive, an SD card and a mobile charger. It should all cost less than $100."
Here is some more Tech News from around the web:
- Google's Chromecast is causing boot-loops on some router models @ The Inquirer
- Azure glitch allowed attackers to gain admin rights over hosted Red Hat Linux instances @ The Inquirer
- Ransomware locks up San Francisco public transportation ticket machines @ Ars Technica
- 2.1Gbps speeds over LTE? That's not a typo, EE's already done it @ The Register
- iOS 10.1.1 Is Causing Battery Issues For Many iPhone Users @ Slashdot
Subject: Systems, Mobile | November 27, 2016 - 09:25 PM | Scott Michaud
Tagged: virtual boy, RISC, Nintendo, nec
I was one of the lucky kids who got a Virtual Boy, which was actually quite fun for nine-year-old me. It wasn’t beloved by the masses, but when you’re in a hotel, moving across the country, you best believe I’m going to punch that Teleroboxer cat in the head, over and over. It was quite an interesting piece of technology, despite its crippling flaws.
To see for yourself, Ben Heck published a full disassembly, with his best-guess explanations. He then performs a repair by 3D printing a clamp to put pressure on a loose ribbon connector.
From a performance standpoint, the Virtual Boy was launched with a 32-bit NEC RISC processor, clocked at 20 MHz. Keep in mind that, one, this is a semi-mobile, battery-powered device and, two, it launched around the same time as the original Pentium processor reached 120 MHz. The RAM setup is... unclear. I’m guessing PlanetVB accidentally wrote MB and KB to refer to “megabit” (Mb) and “kilobit” (kb) instead of “megabyte” and “kilobyte”, meaning the Wikipedia listing of 128KB VRAM, 128KB DRAM, and 64KB WRAM is accurate. The cartridge could also address up to an additional 16MB of RAM, meaning that specific titles could load as much as they need, albeit at a higher BOM cost. Shipped titles maxed out at 8KB of cartridge-expanded RAM, though.
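A quick sanity check of that unit guess (the function name here is mine, just for illustration): if PlanetVB's figures were written in megabits, they line up exactly with the kilobyte numbers on Wikipedia.

```python
# Convert megabits to kilobytes: 1 Mbit = 1024 * 1024 bits, 8 bits per byte.
def mbit_to_kbyte(mbit):
    return mbit * 1024 * 1024 // 8 // 1024

print(mbit_to_kbyte(1))  # -> 128, matching the 128KB VRAM and DRAM listings
```

So a "1 MB" that was really one megabit is 128 KB, and a "0.5 MB" WRAM would be the listed 64 KB, which is why the megabit reading seems right.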
Ben Heck’s video will be part of a series, where he will try to make it smaller and head-mounted.
Subject: General Tech | November 26, 2016 - 11:49 PM | Scott Michaud
Tagged: pc gaming, ubisoft, ea, bethesda
The Ubisoft store is offering the standard edition of either The Division, Assassin's Creed: Syndicate, Rainbow Six: Siege, or Far Cry Primal when you purchase (or pre-order) another, participating title. These other games aren't just from Ubisoft, though. They also include new releases from EA, Bethesda, and Square Enix, such as Battlefield 1 (which still requires Origin) and Skyrim: Special Edition.
This is interesting for two reasons. First, and most obvious, if you really want one of the four titles and one of the applicable ones, then it might be cheaper than buying them individually (although you should check for sales elsewhere first).
The second point regards how the various publishers are handling Steam's dominance in the PC space. EA is now even entering its titles, which are not available on Valve's service, into promotions run by stores that its competitors own. Meanwhile, it seems like Bethesda is happy putting their stock wherever, and they will even discount games by a third or a half if it aligns with a big Steam sale. Then we get Ubisoft, who has their own store, but also lists on Steam and runs fairly good sales there, too.
Anyway, the sale runs until the 27th. As I said earlier, though, be sure that any combination of games that interests you is actually cheaper than their respective sale prices at competing stores before buying.
Subject: Systems | November 26, 2016 - 09:21 PM | Scott Michaud
Tagged: Samsung, Lenovo
thebell, a Korean news outlet and sister site of ZDNet Korea, published a rumor that Samsung was in talks to sell their PC business to Lenovo. While I’m struggling with the Google Translate from Korean, it sounds like this would be caused by Samsung selling their printing business to HP, leading to the company divesting from related markets, too. This news was picked up by the American ZDNet and, some time after, Samsung released a statement outright denying the rumor: “The rumor is not true.”
So, as far as we know, Samsung is staying in the PC market.
Since it was a clear denial, not a decline to comment, this probably means that the rumor is either completely false, or, if it’s based on a kernel of truth, it’s very early or very tiny. It seems likely, though, that Lenovo would want to buy up pretty much anyone’s PC business at this point, if the price is right. As for Samsung selling? I could see it being something that could have been discussed behind-the-scenes to some level of seriousness, although that’s what hoaxes prey upon. Again, as far as we know, Samsung will keep their PC business, and there isn’t really anything concrete to say otherwise.
Subject: General Tech | November 26, 2016 - 01:51 AM | Scott Michaud
Tagged: pc gaming, VR, osvr, razer, sensics
There’s a few competing VR standards at the moment. Obviously, mobile has a bunch of them; Google technically has two of their own. On the PC, the top two are Oculus and SteamVR. A third one, Open Source Virtual Reality (OSVR), was co-founded by Razer and Sensics.
Valve has now added OSVR support to Steam, including the tools that users will need to filter content compatible with that headset.
OSVR is an interesting initiative. For instance, when they released their second developer’s kit, HDK2, they also released an upgrade kit for the original. Currently priced at $220, it upgrades the screen to 2160x1200. They also have a Leap Motion upgrade, although that’s currently listed as “coming soon”. It has also been added to Unreal Engine 4 for the last few versions, so engine developers are considering it worthy of first-party support.
Subject: Displays | November 26, 2016 - 01:29 AM | Scott Michaud
Tagged: AOC, 240Hz, freesync
This is just getting silly. While TN 1080p monitors have been fading into the background, they are fast-switching, and AOC is pushing that advantage. The AOC AGON AG251FZ is a 25-inch FreeSync display that supports refresh rates up to 240 Hz. It's not the first monitor to reach this milestone, as Acer made a similar announcement back in August, but this display should be bright and smooth, especially for our readers with AMD GPUs.
If you like to smoothly scroll documents, then you may also appreciate that its stand can pivot into portrait mode. I doubt it will have the best color representation, though, so those who want to photo edit, especially outside of sRGB, may want to look elsewhere. In fact, they don’t even list their sRGB (web and video) or AdobeRGB (video and print) coverage. I’d hope it would at least have 100% sRGB, but I can’t say for sure.
Subject: General Tech | November 26, 2016 - 12:47 AM | Scott Michaud
Tagged: Japan, supercomputer
According to Reuters, Japan's Ministry of Economy, Trade and Industry has set aside 19.5 billion yen to build a high-end supercomputer. This will translate into 130 petaFLOPS, which would put it ahead of all other announced clusters. The article claims that the government will rent the computer out to Japanese corporations, many of which currently use American-based cloud services.
The supercomputer has been named ABCI: AI Bridging Cloud Infrastructure.
Image Credit: つ via Wikipedia
From a hardware standpoint? There’s not a whole lot else to say about it. The money has been set aside, but no-one has been selected to build it. Companies will submit their bids by December 8th, and we assume they’ll make an announcement at some point after.
This also means we don’t know what is planned to go into each node. Despite targeting ABCI at AI, Japan is sticking to the “FLOPs” rating, and thus will probably be focused on floating-point workloads. It would be weird to see such an expensive machine be focused on 8- or 16-bit instructions, but then we see Google creating custom ASICs, called TPUs, that seem to get huge performance boosts by sticking to low-precision workloads. Could that even scale to a competitive supercomputer? Or would it cut out too many potential customers that need 32- and 64-bit precision?
Either way, I would guess that this computer will use more conventional, GPU-style co-processors from someone like Intel (Xeon Phi) or NVIDIA. Really, we don’t know, though. No-one does at this point. It’s an interesting branding, though.
Subject: General Tech | November 25, 2016 - 11:39 PM | Scott Michaud
Tagged: logitech, mouse, keyboard, g900, g810
I braved the Black Friday lines... at about three in the afternoon, because this guy isn’t going to get trampled for discounts on computer hardware. Luckily, Best Buy still had a single G900 Chaos Spectrum mouse in stock at 50% off, and a few G810 Orion Spectrum keyboards at about 35% off. I was actually looking to pick them up on Boxing Week if they dropped in price, because I surprisingly needed another mechanical keyboard, but this is even better than I expected.
So, I picked up one of each.
One of the things that attracted me to the G900 was its ambidextrous design with a tilt scroll wheel. It’s surprisingly hard to get a mouse for left-handed users that also has four directions of scrolling. The 2014 left-handed edition of the Razer Naga has a tilt wheel, although its left and right mouse buttons are swapped, so those who are used to right-handed mice will need to wait until Razer Synapse loads and connects to reverse them to left-on-left and right-on-right. What I’m trying to say is that, for the last two years, my old mouse would have left button right-click and right button left-click until my profile abruptly kicked in about 30 seconds after login. I don’t need to deal with that anymore, while still keeping the mouse tilt wheel.
I did notice that Logitech's G Software refuses to allow binding scroll wheel input to mouse buttons (which I attach to my thumb buttons for comfortable scrolling). Both EVGA and Razer allow this, although you need to perform a full click for each notch, short of writing an AutoHotkey macro. It's not too bad, because you can bind the keyboard's up and down arrows instead, but scrolling and arrows might not behave the same in all applications, such as with Tweetdeck.
As for the G810, this keyboard feels really nice. The coating of the keycaps is nice and non-stick, the Romer-G switches feel pretty good to me (keeping in mind my favorite Cherry MX switch is the MX Brown), and the keyboard's feet are possibly the best I've used. There are actually two sets of feet: one set that inclines the keyboard to about 4 degrees, and another that raises it to about 8 degrees. (These values are written on them.) Even better, it's stable and takes quite a bit of force to slide.
I would prefer it to have a couple of macro keys, even a single row of them, but there’s only so much I can ask for. The media keys are RGB backlit and surprisingly clicky. I’m not sure what type of switch they use, but it feels mechanical... but a very short one like you would see on a mouse, not a keyboard. The G810 also has a volume roller, which I was a huge fan of when I was introduced to it with the first generation of Corsair K60 and K90 mechanical keyboards. (If another brand did it before them, in 2012, then I’m sorry! Corsair was the first that I’ve seen do it!) I should note that the Logitech roller is a bit smoother than the Corsair one, but, again, the K60 and K90 are about four years old at this point.
So yeah, that’s about it.
Subject: Graphics Cards | November 25, 2016 - 06:21 PM | Jeremy Hellstrom
Tagged: msi, gtx 1070, GTX 1070 Quick Silver, factory overclocked
MSI's new Quick Silver design looks very different from most of their other cards: black and silver with a shiny metal backplate, as opposed to the red and black we are used to. The GTX 1070 which TechPowerUp reviewed carries a bit of a factory overclock; the base core clock is 76MHz higher than the default at 1582MHz, though the VRAM is left at its default frequency. There is headroom left in the card, too: TechPowerUp hit a stable 2101MHz core and 2290MHz VRAM, not the best results they have seen but certainly a decent increase. Drop by for a look at its performance in over a dozen games.
"MSI's GTX 1070 Quick Silver does away with the red-and-black color theme and uses stylish silver instead. Thanks to the powerful cooler from the GTX 1070 Gaming Z, the card is the coolest and quietest GTX 1070 we ever tested. It also comes at a rather affordable $425."
Here are some more Graphics Card articles from around the web:
- ASUS ROG STRIX GTX 1080 Gaming A8G @ Kitguru
- ZOTAC GeForce GTX 1060 Mini 3 GB @ techPowerUp
- Zotac GeForce GTX 1050 Ti @ Hardware Secrets
- XFX Radeon RX 470 @ Hardware Secrets
Tesla stores your Owner Authentication token in plain text ... which leads to a bad Ashton Kutcher movie
Subject: General Tech | November 25, 2016 - 05:52 PM | Jeremy Hellstrom
Tagged: Android, Malware, hack, tesla, security
You might expect better from Tesla and Elon Musk, but apparently you would be disappointed, as the OAuth token in your car's mobile app is stored in plain text. The token, generated when you enter your username and password, is used to control your Tesla. It is good for 90 days, after which you must log in again so a new token can be created. Unfortunately, since that token is stored as plain text, someone who gains access to your Android phone can use it to open your car's doors, start the engine, and drive away. Getting an Android user to install a malicious app which would allow someone to take over their device has proven depressingly easy. Comments on Slashdot suggest it is unreasonable to blame Tesla for security issues in your device's OS, which is hard to argue; on the other hand, it is impossible for Tesla to defend choosing to store your OAuth token in plain text.
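To see why plain-text storage is the problem, here is a minimal sketch; the file path and JSON layout are invented for illustration and are not Tesla's actual app internals. The point is that a bearer token in the clear is usable by anything that can read the file, with no decryption step standing in the way.

```python
import json
import os
import tempfile

# Hypothetical reader: any process with access to the preferences file
# gets a valid token -- possession of the file is 90 days of access.
def read_token(path):
    with open(path) as f:
        return json.load(f)["oauth_token"]

# Simulate the app writing its preferences file to disk in the clear:
prefs = tempfile.NamedTemporaryFile("w", suffix=".json", delete=False)
json.dump({"oauth_token": "tok_plaintext_123"}, prefs)
prefs.close()

stolen = read_token(prefs.name)  # no credentials or keys required
os.unlink(prefs.name)
```

Storing the token encrypted under a key held in the OS keystore would at least force an attacker to compromise the keystore as well, rather than just read a file.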
"By leveraging security flaws in the Tesla Android app, an attacker can steal Tesla cars. The only hard part is tricking Tesla owners into installing an Android app on their phones, which isn't that difficult according to a demo video from Norwegian firm Promon. This malicious app can use many of the freely available Android rooting exploits to take over the user's phone, steal the OAuth token from the Tesla app and the user's login credentials."
Here is some more Tech News from around the web:
- CERT tells Microsoft to keep EMET alive because it's better than Win 10's own security @ The Register
- Amazon Makes Good On Its Promise To Delete 'Incentivized' Reviews @ Slashdot
- Tech giants warn IoT vendors to get real about security @ The Register
- 8 of the best outdoor gadgets and accessories @ The Inquirer
Subject: General Tech | November 25, 2016 - 12:01 PM | Scott Michaud
Tagged: x86, windows 10, microsoft, arm
According to Mary Jo Foley at ZDNet, Microsoft is working on emulating the x86 instruction set on ARM64. Her sources further claim that this is intended to be a Windows 10 feature that is targeting Redstone 3, which is the feature update expected in late 2017 (after the upcoming Creators Update in early 2017). Of course, Microsoft will not comment on this rumor. Mary Jo Foley is quite good at holding out on publishing until she gets multiple, independent sources, though. Still, projects slip, pivot, and outright die all of the time, even if the information was true at one point.
Media Center is still dead, though.
So, while keeping in mind that this might not be true, and, even if it is, it could change: let’s think.
The current speculation is that this might be aimed at enterprise customers, including a potential partnership with HP and Qualcomm. This makes sense for a few reasons, especially when you combine it with Microsoft and Samsung's recent efforts to port .NET Core to ARM. Combining rumors like this might be akin to smashing two rocks together, but you never know if it'll spark something. Anyway, you would expect these sorts of apps could jump architectures fairly well, because they're probably form-based rather than real-time applications. You might be able to get a comfortable enough user experience, even with the inherent overhead of translating individual instructions.
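To make that overhead concrete, here is a toy interpreter sketch, entirely my own and in no way Microsoft's design. Each guest instruction costs a dictionary lookup plus a function call, where native execution would be a single hardware instruction; that per-instruction tax is exactly why form-based apps tolerate emulation better than real-time ones.

```python
# Toy "x86-like" interpreter: every guest instruction is dispatched in
# software, illustrating the per-instruction cost of emulation.
def run(program, regs):
    ops = {
        # r.get(s, s): use the register value if s names one, else treat
        # s as an immediate operand.
        "mov": lambda r, d, s: r.__setitem__(d, r.get(s, s)),
        "add": lambda r, d, s: r.__setitem__(d, r[d] + r.get(s, s)),
    }
    for op, dst, src in program:
        ops[op](regs, dst, src)  # dispatch overhead on every instruction
    return regs

regs = run([("mov", "eax", 2), ("add", "eax", "eax")], {})  # regs["eax"] == 4
```

Real emulators amortize this with dynamic binary translation (compiling hot blocks of guest code to native code once, then reusing them), but the basic trade-off is the same.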
Another possibility is that Microsoft hasn’t given up on the Windows 8 / Windows RT vision.
Back in that era, the whole OS seemed designed to push users toward their new platform, Metro. The desktop was an app, and that app contained all of the Win32 bits, isolating them from the rest of the PC and surrounding that tile with everything WinRT. The new platform was seductive for Microsoft in a few ways. First, it was more secure, and people considered Windows the operating system that’s plagued with malware. Second, it let them assert control over their apps, like Apple does with their App Store. At the time, they even demanded that third-party web browsers be nothing more than re-skins of Internet Explorer. Firefox? Don’t even think about bringing Gecko in here. It’s Trident or bust.
Say what you like about those first two points, I know I have, and often disapprovingly from an art enthusiast standpoint, but there was a third one that also interested Microsoft:
The WinRT runtime, when it was first unveiled, was pretty much designed in a way that Microsoft could swap out everything underneath it if they wanted to jump ship and move to a new architecture. At the time, almost a decade ago, Intel wasn’t competitive against ARM in the mobile space. This kept Windows applications, and Microsoft, watching the rest of the world sail away.
But supporting both ARM and x86 isn't good enough. What if IBM wins next time? Or a completely different instruction set? If everything calls an API that can be uprooted and transplanted elsewhere, there will never need to be this mobile concern again.
But then we have this whole "decades of stuff that already exists" problem. While I don't like the frog-boil analogy, this could be Microsoft's attempt to uproot enough x86-locked content that people can accept UWP. I'm not sure that will work out, especially since we rely upon real-time software that has not embraced the Windows Store, but it might be their goal.
What do you all think?
Subject: General Tech | November 25, 2016 - 09:17 AM | Scott Michaud
Tagged: valve, steam, pc gaming, black friday
Okay, I admit it: I’m a little late on this one. Sorry, all! Sometimes you need to shelf a post because it’s taking forever to write, but you only realize it after days of researching and editing have gone by. In the mean time, simple posts, like this one, begin to collect dust in the queue. You just need to know when to let go, even if it’s temporarily. This time I didn’t.
Oh well. So Valve decided to host their Autumn Sale from now until 1pm (EST) on Tuesday. To me, a sale that starts just before American Thanksgiving and ends hours after Cyber Monday... seems like a Black Friday sale. They even acknowledge it as such in their announcement, so I guess I’m not alone.
There really isn’t much to say, though. Gabe Newell will get your money via big discounts on new and bundled back catalog games... oh wait, there is. Remember how Steam was pushing “Discovery” with their new store aesthetic? How it was supposed to help users find relevant content within their store? They just decided to create “The Steam Awards”, which are user-nominated through the store listing.
This is quite interesting. From Steam's perspective, it allows a handful of games to get promoted to a wider audience, which could allow some games to break out of their niche. On the other hand, since it is user-selected, a game would need a niche to have a chance at that exposure. Whether it helps good games find an audience they would otherwise miss? Not sure. I am interested to see, if this really is a phase in the Discovery initiative, what else will be introduced. Time will tell...
Subject: Mobile | November 24, 2016 - 06:41 PM | Jeremy Hellstrom
Tagged: ubuntu, Oryx Pro, GTX1060, gaming laptop, desktop replacement
The Oryx Pro is the opposite of most of the laptops you have seen reviewed here recently. At 15.2x10.7x1.1" and 5.5lbs, it is bulkier than the slim laptops dominating the market, not to mention the 2lb power brick. It also runs Ubuntu 16.04 LTS as opposed to Win10; thankfully, the install is well configured for the hardware present, according to the review at Ars Technica. The hardware, on the other hand, is familiar and rather impressive: a desktop-class GTX 1060, an i7-6700HQ, 32GB of RAM, and a 256GB SSD. The model reviewed at Ars runs you almost $1900, and there is a 17" model, as well as a GTX 1070 upgrade, available if you so desire. Pop by to take a look at the full review of this Linux-powered laptop.
"System76 has a decent range of laptops, from the small, lightweight, battery-sipping Lemur to the top-end beast-like Oryx Pro. And after recently reviewing the svelte, but not necessarily top-end-specced Dell XPS 13, I got curious about this Oryx Pro. On paper, it sounds like a desktop machine somehow packed into a laptop form factor"
Here are some more Mobile articles from around the web:
- Gigabyte AORUS X7 DT v6 GTX 1080 Gaming Laptop @ eTeknix
- Dell Inspiron 13 5000 (5368 / 5378) 2-in-1 Laptop @ TechARP
- ASUS Chromebox 2-G084U @ Kitguru
- honor 8 Aurora Glass @ TechARP
- Xiaomi Mi Mix review—This is what the future of smartphones looks like @ Ars Technica
Subject: General Tech | November 24, 2016 - 05:35 PM | Jeremy Hellstrom
Tagged: security, hack, audio, Realtek
Security researchers have discovered a way to flip an output channel on onboard Realtek audio into an input channel, thus turning your headphones into an unpowered microphone. The ability of a speaker or headphone to be used as a microphone is not news to anyone who has played around with headphones or input jacks, but it is possible some readers had deprived childhoods and have never tried this. While you cannot mitigate this vulnerability permanently, you would certainly notice it, as your headphones would no longer play audio while the port is configured as input.
Drop by Slashdot for a link, and if you have never tried this out before, you really should find an old pair of headphones and experiment with ports, as well as snipping off one side of a pair of earbuds. One supposes iPhone 7 users need not worry.
"In short, the headphones were nearly as good as an unpowered microphone at picking up audio in a room. It essentially "retasks" the RealTek audio codec chip output found in many desktop computers into an input channel. This means you can plug your headphones into a seemingly output-only jack and hackers can still listen in. This isn't a driver fix, either."
Here is some more Tech News from around the web:
- Microsoft's nerd goggles will run on a toaster @ The Register
- Microsoft is reportedly sharing Windows 10 telemetry data with third-parties @ The Inquirer
- Google Sends State-Sponsored Hack Warnings To Journalists and Professors @ Slashdot
- Samsung might flog its PC biz to Lenovo for £680m @ The Inquirer
- HTC denies rumors of selling its smartphone business @ DigiTimes
- Samsung fires $70m at quantum televisions @ The Register
- 11 Essential Black Friday Computer Parts for Gamers @ Hardware Secrets
Subject: General Tech | November 23, 2016 - 06:39 PM | Jeremy Hellstrom
Tagged: nvidia, gears of war 4, gaming, dx12, async compute, amd
[H]ard|OCP sat down with the new DX12-based Gears of War 4 to test the performance of the game on a variety of cards, with a focus on the effect of enabling Async Compute. In their testing they found no reason for Async Compute to be disabled, as it did not hurt the performance of any card. On the other hand, NVIDIA's offerings do not benefit in any meaningful way from the feature, and while AMD's cards certainly did, it was not enough to allow you to run everything at maximum on an RX 480. Overall the game was no challenge to any of the cards except perhaps the RX 460 and the GTX 1050 Ti. When playing at 4K resolution they saw memory usage in excess of 6GB, making the GTX 1080 the card for those who want to play with the highest graphical settings. Get more details and benchmarks in their full review.
"We take Gears of War 4, a new Windows 10 only game supporting DX12 natively and compare performance with seven video cards. We will find out which one provides the best experience at 4K, 1440p, and 1080p resolutions, and see how these compare to each other. We will also look specifically at the Async Compute feature."
Here is some more Tech News from around the web:
- Total War: WARHAMMER NVIDIA Linux Benchmarks @ Phoronix
- Total War: Warhammer’s Wood Elves like to shoot and run @ Rock, Paper, SHOTGUN
- Deus Ex: Mankind Divided DX12 Performance @ [H]ard|OCP
- Star Wars Battlefront’s Rogue One DLC on December 6th @ Rock, Paper, SHOTGUN
- AMD Radeon RX 470 Hitman Complete promo goes live @ HEXUS
- Shadow Tactics demo offers Commandos-y stealth @ Rock, Paper, SHOTGUN
- AMD & NVIDIA GPU VR Performance - Google Earth VR @ [H]ard|OCP
- Quick Look: Dark Souls III: Ashes of Ariandel @ GiantBomb
- Origin/EA Black Friday Sale
- AI War 2 returns to Kickstarter, smaller and cheaper @ Rock, Paper, SHOTGUN