Subject: General Tech | November 13, 2016 - 11:24 PM | Scott Michaud
Tagged: optical computing, HPC
We occasionally discuss photonic computers as news is announced, because we're starting to reach “can count the number of atoms with fingers and toes” sizes of features. For instance, we reported on a chip made by the University of Colorado Boulder and UC Berkeley that had both electronic and photonic integrated circuits on it.
This announcement from Optalysys is completely different.
The Optalysys GENESYS is a PCIe add-in board that is designed to accelerate certain tasks. For instance, light is Fourier transformed when it passes through a lens, and inverse Fourier transformed when it is refocused by a second lens. When I was taking fourth-year optics back in 2009, our professor mentioned that scientists used this trick to solve Fourier transforms by flashing light through a 2D pattern, passing it through a lens, and projecting it onto film. This image was measured pixel by pixel, with each intensity corresponding to the value of the 2D Fourier transform of the original pattern. Fourier transforms take a long time to solve algebraically, especially without modern computers, so this was a huge win; you're solving a 2D grid of values in a single step.
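That single-lens trick maps directly onto what we now call a 2D FFT. Here's a minimal NumPy sketch of the digital equivalent; the bright-square pattern is something I made up for illustration, standing in for the 2D mask the light was flashed through:

```python
import numpy as np

# A 2D pattern standing in for the mask the light passes through:
# a small bright square on a dark field (values are transmission, 0..1).
pattern = np.zeros((64, 64))
pattern[28:36, 28:36] = 1.0

# What the lens does optically in one step: the field at the focal
# plane is (proportional to) the 2D Fourier transform of the pattern.
spectrum = np.fft.fftshift(np.fft.fft2(pattern))

# The film only records intensity, i.e. |amplitude|^2 -- phase is lost.
intensity = np.abs(spectrum) ** 2

# A square aperture produces a 2D sinc pattern; the brightest spot is
# the zero-frequency (DC) term, at the center of the grid after fftshift.
peak = tuple(int(i) for i in
             np.unravel_index(np.argmax(intensity), intensity.shape))
print(peak)  # (32, 32) -- the center of the 64x64 grid
```

Note that the film step loses the phase of the transform, which is one reason these optical computers are better suited to some problems (correlation, pattern matching) than others.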
These are the sorts of tricks that the Optalysys GENESYS claims to use. They claim that this will speed up matrix multiplication, convolutions (Fourier transforms -- see the previous paragraph), and pattern recognition (such as for DNA sequencing). Matrix multiplication is a bit surprising to me, because it's not immediately clear how you can abuse light dynamics to calculate it, but someone who has more experience in this field will probably say “Scott, you dummy, we've been doing this since the 1800s” or something.
Image Credit: Tom Roelandts
The circles of the filter (center) correspond to the frequencies it blocks or permits.
The frequencies correspond to how quickly an image changes.
This is often used for noise reduction or edge detection, but it's just a filter in Fourier space.
You could place it between two lenses to modify the image in that way.
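Digitally, that filter-between-two-lenses setup is just a mask applied between a forward and an inverse FFT. A rough sketch of the digital equivalent, with a made-up noise image and an arbitrary cutoff radius (my assumptions, not anything from Optalysys):

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((128, 128))  # stand-in for a noisy photograph

# Forward transform -- the "first lens".
spectrum = np.fft.fftshift(np.fft.fft2(image))

# The filter: a circular mask in the Fourier plane. Keeping the center
# (low frequencies) smooths the image; keeping only the outer region
# (high frequencies) would instead emphasize edges.
yy, xx = np.indices(image.shape)
cy, cx = image.shape[0] // 2, image.shape[1] // 2
radius = np.hypot(yy - cy, xx - cx)
low_pass = radius < 16  # cutoff chosen arbitrarily for the demo

# Inverse transform -- the "second lens" refocusing the image.
smoothed = np.fft.ifft2(np.fft.ifftshift(spectrum * low_pass)).real

# Discarding high frequencies removes fine-grained variation,
# so the filtered image varies less than the original.
print(image.std(), smoothed.std())
```

The circular mask here plays exactly the role of the physical filter in the image above: opaque where frequencies are blocked, transparent where they pass.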
From a performance standpoint, their “first demonstrator system” operated at 20Hz with a 500x500 resolution. However, their video claims they expect to have a “PetaFLOP-equivalent co-processor” by the end of 2017. For comparison, modern GPUs are just barely into the tens of TeraFLOPs, but that comparison is about as useful as comparing a CPU core to a digital signal processor (DSP). (I'm not saying this is analogous to a DSP, just that performance comparisons are about as useful.)
Optalysys expects to have a 1 PetaFLOP co-processor available by the end of the year.
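To put the demonstrator's numbers in FLOP terms: counting one 500×500 optical transform as one digital 2D FFT, using the conventional 5·N·log₂(N) operation estimate (a back-of-envelope assumption on my part, not Optalysys math), 20 transforms per second works out to hundreds of MFLOPS-equivalent, a long way from a PetaFLOP:

```python
import math

n = 500 * 500          # points per 2D transform
rate = 20              # transforms per second in the first demonstrator

# Standard rough operation count for an N-point FFT: 5 * N * log2(N).
flops_per_fft = 5 * n * math.log2(n)
equivalent = flops_per_fft * rate

print(f"{equivalent / 1e6:.0f} MFLOPS-equivalent")
print(f"{1e15 / equivalent:.1e}x short of a PetaFLOP")
```

That gap is presumably why they are targeting much higher resolutions and frame rates for the production co-processor.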
Subject: General Tech | November 11, 2016 - 04:36 AM | Scott Michaud
Tagged: pc gaming, htc, htc vive
UploadVR is reporting that a wireless upgrade kit was on display at a trade-show by Alibaba in Shenzhen, China. TPCAST, the company that created the accessory for the headset, is a participant in the Vive X program. This startup accelerator provides $50,000 to $200,000, mentorship, and other support to assist development of VR-related technologies. HTC claims that TPCAST's wireless solution will perform equivalently to the default, wired configuration.
Image Credit: UploadVR
Wireless almost always requires a battery, and HTC claims that two options will be available. The default “standard” battery is expected to last about 90 minutes, although they plan a larger battery that fits in the pocket of the wearer's clothing. UploadVR doesn't mention anything about the price or capacity of this one, although I hope that the wiring from clothes to headset is easily managed.
The upgrade kit will cost about $220, when converted into USD from Chinese Yuan, and begins pre-order on November 11th at 7am PST. The units will ship in early 2017 with current owners of the HTC Vive (authenticated by serial number) getting bumped to the front of the line. I'm guessing this is to gut the scalping market, which is nice, unless they goof and allow unlimited orders for a single serial number.
Subject: Graphics Cards | November 11, 2016 - 01:27 AM | Scott Michaud
Tagged: quarterly earnings, nvidia
The most recent quarter for NVIDIA, covering the three months ending on October 30th, has just passed $2 Billion USD in revenue, an increase of 54% from last year. All said and done, this leads to $542 million in GAAP net income, which is up 108% from last quarter (or up 120% from the same quarter last year).
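As a quick sanity check on those growth figures, here's the arithmetic to back out the implied prior periods. These are derived from the percentages above, not NVIDIA's actual reported prior-period results:

```python
revenue = 2.0e9       # this quarter's revenue, USD (as stated)
net_income = 542e6    # this quarter's GAAP net income (as stated)

# Invert the quoted growth rates to get the implied earlier figures.
implied_prior_year_revenue = revenue / 1.54       # up 54% year-over-year
implied_prior_quarter_income = net_income / 2.08  # up 108% quarter-over-quarter
implied_prior_year_income = net_income / 2.20     # up 120% year-over-year

print(f"~${implied_prior_year_revenue / 1e9:.2f}B revenue a year ago")
print(f"~${implied_prior_quarter_income / 1e6:.0f}M net income last quarter")
print(f"~${implied_prior_year_income / 1e6:.0f}M net income a year ago")
```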
NVIDIA doesn't attribute this increase to any specific line of products. Instead, CEO Jen-Hsun Huang takes the opportunity to promote the “years of work and billions of dollars” they spent on the Pascal architecture, applying it all over the place. While I'm guessing a lot of the sales are carried over from last quarter's parts, which are now able to keep up with demand, NVIDIA points to laptop SKUs of 10-series GPUs, the launch of Tesla P4 and P40 GPUs, and initial shipments of the DGX-1 as new and notable for this quarter.
NVIDIA expects to have an even better quarter with the holiday, aimed at $2.1 Billion USD, plus or minus a couple percent. A lot more details are available on NVIDIA's blog, including their Switch announcement with Nintendo, their Drive PX2 platform, and their next-generation Tegra processor, codenamed Xavier.
Podcast #424 - AMD Radeon Pro GPUs, Corsair Carbide Air 740 Review, MSI Gaming Notebook Overview, VRMark, and more!
Subject: General Tech | November 10, 2016 - 07:22 PM | Allyn Malventano
Tagged: VRMark, VR, video, Red Alert 2, radeon pro, podcast, nvidia, notebook, NES Classic, nasa, msi, Mate 9, Leica, laptop, Kirin 960, gaming, DeepMind, carbide air 740
PC Perspective Podcast #424 - 11/10/16
Join us this week as we discuss new AMD Radeon Pro GPUs, Corsair Carbide Air 740 Review, MSI Gaming Notebook Overview, VRMark, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Allyn Malventano, Josh Walrath, and Jeremy Hellstrom
Program length: 1:09:34
Week in Review:
News items of interest:
Hardware/Software Picks of the Week
Subject: Storage | November 10, 2016 - 06:36 PM | Jeremy Hellstrom
Tagged: adata, external ssd, SE730, usb 3.1, type c
At 250GB and 72.7x44x12.2mm (2.8x1.7x0.4"), this external SSD from ADATA is small in two ways, which is a mixed blessing for mobile storage. You may find the capacity somewhat cramped, but the device is very portable and inexpensive. The Type-C to Type-A USB 3.1 connection provided up to 427MB/s transfer speeds in The SSD Review's ATTO testing, with Crystal Disk Mark showing 341MB/s read and 376MB/s write. While those speeds are not up to the theoretical maximum for USB 3.1, they are still impressive for an external device. Check out the full review right here.
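To put the “not up to the theoretical maximum” note in numbers: the review doesn't say which USB 3.1 generation the SE730 negotiates, so here's a quick calculation against both line-rate ceilings (signaling rate minus line-coding overhead only; real protocol overhead cuts these figures further):

```python
# Observed ATTO result from the review, in MB/s.
observed = 427

# Raw signaling rates and line-coding overhead for the two USB 3.1 gens:
# Gen 1 uses 8b/10b encoding, Gen 2 uses 128b/132b.
gens = {
    "Gen 1 (5 Gbit/s)": 5e9 * 8 / 10,
    "Gen 2 (10 Gbit/s)": 10e9 * 128 / 132,
}

for name, bits_per_sec in gens.items():
    usable_mb = bits_per_sec / 8 / 1e6  # payload ceiling in MB/s
    print(f"{name}: {observed / usable_mb:.0%} of the line-rate ceiling")
```

Against Gen 1's ~500 MB/s ceiling the drive is doing quite well; against Gen 2's ~1.2 GB/s it has much more headroom, which is consistent with the review's “not up to the theoretical maximum” remark.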
"The ADATA SE730 differs from many other SSDs, however, as it contains the characteristics of being waterproof, dustproof and shockproof, in addition to its small size. If you want storage that will overcome the elements, the SE730 just might be what you're looking for. In addition, this external SSD has a great price and can be found at Amazon for $120."
Here are some more Storage reviews from around the web:
- Drobo 5C USB Type-C DAS @ Kitguru
- TerraMaster F2-220 @ Modders-Inc
- Lexar 128GB Professional 1066x CompactFlash Card @ SSD Review
- Crucial MX300 750 GB @ techPowerUp
- Toshiba OCZ TL100 SSD @ The SSD Review
- Micron 1100 256GB SSD Review @ NikKTech
- Crucial MX300 2TB @ Kitguru
- ADATA Ultimate SU800 240GB SSD Review @ Hardware Canucks
Subject: General Tech | November 10, 2016 - 05:17 PM | Jeremy Hellstrom
Tagged: wifi, usb 3.1, Intel
Rumours have reached the sensitive ears of DigiTimes about the inclusion of USB 3.1 and WiFi chips on Intel's upcoming 300-series chipsets. This move continues the pattern of absorbing secondary systems into single chips, just as we saw with the extinction of the Northbridge after AMD and Intel rolled the graphics and memory controller hubs into their APUs. This will have an adverse effect on demand from Broadcom, Realtek and ASMedia, who previously supplied the chips controlling these features. On the other hand, this could lower the price AMD will have to pay for those components when we finally see their new motherboards arrive on the market. Do not expect to see these boards soon though; the prediction for the arrival of the 300-series motherboards is still around 12 months from now.
"Intel reportedly is planning to add USB 3.1 and Wi-Fi functions into its motherboard chipsets and the new design may be implemented in its upcoming 300-series scheduled to be released at the end of 2017, according to sources from motherboard makers."
Here is some more Tech News from around the web:
- Android 7.0 Nougat beta available now for Galaxy S7 and S7 Edge owners @ The Inquirer
- Google rejects EU's Android antitrust charges @ The Inquirer
- You mean Office 365 deployments don't secure themselves? @ The Register
Subject: Graphics Cards, Systems | November 10, 2016 - 04:44 PM | Ryan Shrout
Tagged: VR, rift, Oculus, atw, asynchronous timewarp, asynchronous spacewarp, asw
Oculus has announced that as of today, support for Asynchronous Spacewarp is available and active for all users that install the 1.10 runtime. Announced at the Oculus Connect 3 event in October, ASW promises to complement existing Asynchronous Timewarp (ATW) technology to improve the experience of VR for lower performance systems that might otherwise result in stutter.
A quick refresher on Asynchronous Timewarp is probably helpful. ATW was introduced to help alleviate the impact of missed frames on VR headsets, and its development started back with the Oculus DK2 headset. By shifting the image on the VR headset, without input from the game engine, based on relative head motion that occurred AFTER the last VR pose was sent to the game, timewarp presents a more accurate image to the user. While this technology was first used as a band-aid for slow frame rates, Oculus felt confident enough in its advantages to the Rift that it is enabled for all frames of all applications, regardless of frame rate.
ATW moves the entire frame as a whole, shifting it only based on relative changes to the user’s head rotation. The new Asynchronous Spacewarp attempts to shift objects and motion inside of the scene by generating new frames to insert between “real” frames from the game engine when the game is running at 45 FPS. With a goal of maintaining a smooth, enjoyable and nausea-free experience, Oculus says that ASW “includes character movement, camera movement, Touch controller movement, and the player's own positional movement.”
To many of you that are familiar with the idea of timewarp, this might sound like black magic. Oculus presents this example on their website to help understand what is happening.
Seeing the hand with the gun in motion, ASW generates a frame that continues the animation of the gun to the left, tricking the user into seeing the continuation of the motion they are going through. When the next actual frame is presented just after, the gun will have likely moved slightly more than that, and then the pattern repeats.
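Stripped down to a single object moving along one axis, the extrapolation step looks roughly like this. This is a toy sketch of the idea only, not Oculus's actual algorithm, which operates on full images using motion and depth data:

```python
# Positions of an object (say, the gun barrel) in the last two real
# frames, rendered at 45 FPS. Units are arbitrary screen coordinates.
prev_pos, last_pos = 100.0, 108.0

# Estimated motion per real (45 FPS) frame.
velocity = last_pos - prev_pos

# ASW inserts one synthetic frame halfway to the *next* real frame,
# so extrapolate half a real-frame step forward to hit the 90 Hz beat.
synthetic_pos = last_pos + 0.5 * velocity

print(synthetic_pos)  # 112.0
```

When the next real frame arrives, the object will (hopefully) be near position 116, so the synthetic frame bridges the gap; when the estimate is wrong, you get the stretching artifacts discussed below.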
You can notice a couple of things about ASW in this animation example however. If you look just to the right of the gun barrel in the generated frame, there is a stretching of the pixels in an artificial way. The wheel looks like something out of Dr. Strange. However, this is likely an effect that would not be noticeable in real time and should not impact the user experience dramatically. And, as Oculus would tell us, it is better than the alternative of simply missing frames and animation changes.
Some ASW interpolation changes will be easier than others thanks to secondary data being available. For example, with the Oculus Touch controller, the runtime will know how much the player’s hand has moved, and thus how much the object being held has moved, and can better estimate the new object location. Positional movement would also have this advantage. If a developer has properly implemented the different layers of abstraction for Oculus and its runtime, separating out backgrounds from cameras from characters, etc., then the new frames being created are less likely to have significant distortions.
I am interested in how this new feature affects the current library of games on PCs that do in fact drop below that 90 FPS mark. In October, Oculus was on stage telling users that the minimum spec for VR systems was dropping from requiring a GTX 970 graphics card to a GTX 960. This clearly expands the potential install base for the Rift. Will the magic behind ASW live up to its stated potential without an abundance of visual artifacts?
In a blog post on the Oculus website, they mention some other specific examples of “imperfect extrapolation.” If your game or application includes rapid brightness changes, object disocclusion trails (an object moving out of the way of another object), repeated patterns, or head-locked elements (that aren’t designated as such in the runtime), these could cause distracting artifacts in the animation if not balanced and thought through. Oculus isn’t telling game developers to go back and modify their titles, but instead to "be mindful of their appearance."
Oculus does include a couple of recommendations to developers looking to optimize quality for ASW with locked layers, using real-time rather than frame count for animation steps, and easily adjustable image quality settings. It’s worth noting that this new technology is enabled by default as of runtime 1.10 and will start working once a game drops below the 90 FPS line only. If your title stays over 90 FPS, then you get the advantages of Asynchronous Timewarp without the potential issues of Asynchronous Spacewarp.
The impact of ASW will be interesting to see. For as long as Oculus has been around they have trumpeted the need for 90 FPS to ensure a smooth gaming experience free of headaches and nausea. With ASW, that, in theory, drops to 45 FPS, though with the caveats mentioned above. Many believe, as do I, that this new technology was built to help Microsoft partner with Oculus to launch VR on the upcoming Scorpio Xbox console coming next year. Because the power of that new hardware still will lag behind the recommended specification from both Oculus and Valve for VR PCs, something had to give. The result is a new “minimum” specification for Oculus Rift gaming PCs and a level of performance that makes console-based integrations of the Rift possible.
Subject: Graphics Cards | November 10, 2016 - 02:52 AM | Scott Michaud
Tagged: graphics drivers, dishonored 2, crimson, amd
Just a handful of days into this busy month for video game companies, AMD has released their third Radeon Software Crimson Edition driver for November. 16.11.3, like 16.11.2 and 16.11.1, is not certified by WHQL. From a quality standpoint, Microsoft certification hasn't exactly made a difference over the last year or so. In fact, both graphics vendors rapidly releasing hotfixes between regular WHQL milestones seems to provide a better user experience.
Unfortunately, this does mean that users of a clean-installed Windows 10 1607 with Secure Boot enabled will be missing out. Correction: The drivers are actually signed by Microsoft through the attestation process.
As for the driver itself, 16.11.3 rolls in AMD's optimizations for Dishonored 2. The game goes live in two days, so this should give users an opportunity to find a good time to install and reboot before launch. It also fixes an issue where Valve's Steam client and EA's Origin client would fail when an external GPU, using AMD's X-Connect Technology standard, is detached.
Subject: General Tech | November 10, 2016 - 12:05 AM | Scott Michaud
Tagged: ubisoft, pc gaming, free games, free
Ubisoft has been giving away a game for free to all who claim it, once per month. If you do, then it is yours forever. If not, then you missed it. The most recent entry is Far Cry 3: Blood Dragon, which is a standalone spin-off of the Einstein-quoting island shooter that parodies 80s action content. These games are delivered through their UPlay digital distribution platform, and you'll need an Ubisoft account to claim them, but that's your choice to make for free content.
We're almost at the end of Ubisoft's 30th anniversary promotion, with just a single title left. I'm not sure what it is, but I'm guessing it has some significance to the company and, like the announcement of a sequel to Beyond Good and Evil, could be accompanied by larger news.
Subject: General Tech | November 9, 2016 - 11:25 PM | Scott Michaud
Tagged: pc gaming, VR, ue4, red alert, command and conquer
Command and Conquer: Red Alert 2 was a 2D real-time strategy game set in a science-fiction, alternate-universe version of the Cold War, Allies vs. Soviets. The base-building mechanic involved collecting funds from captured neutral structures and harvesting resources throughout the map. Ádám Horváth, a fan of the series, created a VR implementation in Unreal Engine 4, with 3D assets created by an artist who goes by the name Slye_Fox.
The interface implementation is particularly interesting. It looks almost like someone hovering over a board game, interfacing with the build menu via a virtual hand-held tablet. The game mechanics look quite complete, with even things like enemy AI and supply crates implemented (although I think the camera didn't catch one actually being picked up). It definitely looks good, and it could form the basis for a full real-time strategy interface for VR.
Subject: Systems | November 9, 2016 - 08:31 PM | Jeremy Hellstrom
Tagged: VR, vive, rift, Oculus, htc, build guide, amd
Neoseeker embarked on an interesting project recently: building a VR-capable system which costs less than the VR headset it will power. We performed a similar feat this summer with a rig which at the time cost roughly $900. Neoseeker took a different path, using AMD parts to keep the cost low while still providing the horsepower required to drive a Rift or Vive. They tested their rig on The Lab, Star Wars: Trials on Tatooine and Waltz of the Wizard, finding the performance smooth and, most importantly, not creating the need for any dimenhydrinate. There are going to be some games this system struggles with, but at a total cost under $700 this is a great way to experience VR even if you are on a budget.
"Team Red designed this system around their very capable Radeon RX 480 8GB video card and the popular FX-6350 Vishera 6-Core CPU. The RX 480 is obviously the main component that will not only be leading the dance, but also help drive the total build cost down thanks to its MSRP of $239. At the currently listed online prices, the components for system will cost around $660 USD in total after applicable rebates."
Here are some more Systems articles from around the web:
- Intel Kaby Lake Linux Testing With MSI's Cubi 2 Mini PC @ Phoronix
- MSI Aegis Ti (GTX 1080 SLI) Gaming PC @ Kitguru
- Gigabyte BRIX i7A-7500 @ Kitguru
- Freshtech Solutions Project 7 GTX 1080 Gaming PC @ eTeknix
Subject: General Tech | November 9, 2016 - 07:03 PM | Jeremy Hellstrom
Tagged: brookhaven experiment, VR, amd, nvidia, htc vive
[H]ard|OCP has a new Vive title to test on AMD and NVIDIA silicon, a wave shooter with some horror elements called The Brookhaven Experiment. As with most of these games, they found some interesting results in the testing; in this case the GPU load stayed very consistent, regardless of how much was on the screen at any time. The graphical settings in this title are quite bare, but it does support supersampling, which [H]ard|OCP recommends you turn on when playing the game, if your system can support it. Check out the rankings in their full review.
"If naked mutants from another dimension with horribly bad skin conditions interests you, this is YOUR VR game! The Brookhaven Experiment is a tremendously intense 360 degree wave shooter that will keep you on your toes, give you a workout, and probably scare the piss out of you along the way. How do AMD and NVIDIA stack up in VR?"
Here is some more Tech News from around the web:
- How To Get Old Skyrim Saves Working In The Skyrim Special Edition @ Rock, Paper, SHOTGUN
- Call of Duty: Infinite Warfare PC graphics benchmark performance @ Guru of 3D
- Wot I Think – Call Of Duty: Infinite Warfare’s Campaign @ Rock, Paper, SHOTGUN
- Call of Duty Infinite Warfare: Performance Analysis @ techPowerUp
- Space Hulk: Deathwing Stomping Into December @ Rock, Paper, SHOTGUN
- Lethal VR: A potent VR shooter from the creators of Burnout @ Ars Technica
- Mass Effect: Andromeda Trailer Shows New Worlds @ Rock, Paper, SHOTGUN
- Far Cry 3: Blood Dragon revealed as the next Ubi30 gaming freebie @ HEXUS
- Dishonored 2 Trailer Champions A Cutthroat Empress @ Rock, Paper, SHOTGUN
Subject: General Tech | November 9, 2016 - 06:10 PM | Jeremy Hellstrom
Tagged: hack, iot, phillips, hue
If you were hoping to drive someone a wee bit crazy by remotely controlling their light bulbs, you have probably missed your opportunity, as Philips has patched the vulnerability. This is a good thing, as it was a very impressive flaw. Security researchers found a vulnerability in the ZigBee system used to control Philips Hue smart light bulbs, and they did not need to be anywhere near the lights to exploit it. They used a drone from over 1000 feet away to break into the system and cause the lights to flash and, even worse, they were able to ensure that the bulbs would no longer accept firmware updates, which made their modifications permanent. Unpatched systems could be leveraged to turn all the lights off permanently, or to start an unexpected disco light show if you wanted to be creative. You can pop by Slashdot for a bit more information on how this was carried out.
"Researchers were able to take control of some Philips Hue lights using a drone. Based on an exploit for the ZigBee Light Link Touchlink system, white hat hackers were able to remotely control the Hue lights via drone and cause them to blink S-O-S in Morse code. The drone carried out the attack from more than a thousand feet away."
Here is some more Tech News from around the web:
- NASA Puts its 3D Models Up on GitHub @ Hack a Day
- Google to patch Chrome mobile hole after bank trojan hits 318k users @ The Register
- Microsoft prises open Azure containers, pours in a little Kubernetes @ The Register
- TSMC board approves US$4.91 billion for capacity expansion @ DigiTimes
- Microsoft launches Skype Insiders Programme - but don't tell anyone @ The Inquirer
- Tobii Tracker 4C Review @ OCC
Subject: Cases and Cooling | November 8, 2016 - 09:28 PM | Jeremy Hellstrom
Tagged: supernova, evga, 80 Plus Gold, modular psu, SuperNOVA G3
Lee has reviewed more than a few EVGA SuperNOVA PSUs over the years, and they tend to win the Gold as they are very well made. The new G3 models are a bit smaller in volume than the previous G2 PSUs, at 85mm high, 150mm wide and 150mm long; previously they tended to be 165mm long. The new size also means the Hydraulic Dynamic Bearing fan shrinks to 130mm, and it continues to support the EVGA ECO mode for those who desire quiet operation. There is a seven-year warranty on the 550W and 650W models; the higher-powered models are covered for a full decade. We don't have a review up yet, so keep an eye out for that; in the meantime, have some PR.
November 8th, 2016 - EVGA SuperNOVA power supplies are well-known for their extreme efficiency, performance and reliability. In fact, over the last 3 years EVGA SuperNOVA power supplies have won over 70 awards from leading review sites. EVGA’s dedication to performance has forged the latest power supply platform: the EVGA SuperNOVA G3 Series. With these new power supplies, EVGA took the best features of the award-winning G2 lineup and made them even better. The SuperNOVA G3’s smaller size, improved performance and a new Hydraulic Dynamic Bearing fan give you ultra-quiet performance with an increased lifespan.
Small Size, Big Performance
A reduced size does not mean reduced performance with the EVGA SuperNOVA G3 power supplies. At only 150mm long, these are some of the smallest power supplies on the market today, while offering improved performance and features.
Whisper Silent Hydraulic Dynamic Bearing Fan
Better performance, quieter operation and a longer lifespan. EVGA ECO mode gives you silent operation at low to medium loads.
Next Gen Performance
Next-generation systems deserve next-generation performance. The EVGA SuperNOVA G3 takes the already efficient G2 series and makes it even better with improved efficiency and lower ripple. These power supplies are over 91% efficient!
Fully Modular Design
Use only the cables you need, reducing cable clutter and improving case airflow.
Subject: Displays | November 8, 2016 - 08:44 PM | Jeremy Hellstrom
Tagged: ZeroFrame, ips, Acer BE270U, acer, 75hz, 1440p
Acer has just released a new model in their ZeroFrame series of displays, the BE270U. It is a 27" IPS 100% sRGB display at 1440p resolution with a refresh rate of 75Hz and a 6ms response time. For connectivity you can choose between a pair of Mobile High-Definition Link (MHL) ports for charging and displaying from portable devices, DisplayPort 1.2 in and out (which allows you to chain displays), MiniDP, and a USB 3.0 hub (one up and four down). Two 2W speakers are slipped into the display as well, if you desire to make use of them, as is Picture-in-Picture functionality. The display is aimed at content creators and other professionals, but it is still capable of offering decent performance when gaming.
It is available for $500 from Acer. PR after the picture.
SAN JOSE, Calif. (Nov. 8, 2016) Acer America today announced the U.S. availability of the Acer BE270U monitor, boasting stunning images with a brilliant WQHD 2560x1440@75Hz resolution and a ZeroFrame design in a 27-inch panel.
This premium, feature-rich display is ideal for graphics professionals, such as art directors, photographers, videographers and website designers as well as enthusiasts who enjoy photography and video editing as hobbies. Thanks to an IPS panel, wide 178-degree viewing angles enhance visual collaboration with others on joint projects or simply sharing photos and videos with friends.
“Our newest monitor delivers gorgeous images in a feature-rich, ergonomic design for customers wanting a no-compromise viewing experience,” said Ronald Lau, Acer America director – stationary computing. “In addition to optimizing viewing comfort, it provides superb ability for multi-tasking with multi-streaming and picture-in picture capability in an energy-efficient design.”
Picture-in-Picture, Multi-Streaming Increase Productivity
Picture-in-picture capability allows customers to watch a movie or video while working. Multi-stream technology supports up to three additional monitors through a single cable leveraging a video hub or daisy-chainable displays via DisplayPort. Thanks to the ZeroFrame design, it provides seamless viewing among all linked monitors.
The Acer BE270U meets the highest standards for color accuracy with 100 percent sRGB coverage and 6-axis color adjustment. It boasts a 16:9 aspect ratio, 350 cd/m2 brightness and 16.07 million colors. A crisp 100,000,000:1 maximum contrast ratio and a 6ms response time contribute to the stunning picture quality. Acer EyeProtect may help reduce eye fatigue, incorporating several features that take into consideration prolonged usage by heavy users such as programmers, writers and graphic designers. For ergonomics, a multi-function ErgoStand tilts from -5 to 35 degrees, swivels up to 60 degrees, adjusts in height up to 5.9 inches and pivots 90 degrees. A quick release design lets users separate the monitor from its stand, so it can be VESA wall-mounted to conserve desk space.
This practical monitor provides an array of connectivity options including MHLx2 for charging portable devices, DisplayPort (v1.2), Mini DisplayPort, DisplayPort out+SPK, and a USB 3.0 hub (1 up/4 down) for connecting multiple peripherals simultaneously. Two 2W speakers deliver quality audio.
In addition to being ENERGY STAR and TCO 7.0 qualified, the Acer BE270U monitor is EPEAT Gold registered, the highest level of EPEAT registration available. Mercury-free and LED-backlit, the Acer BE270U reduces energy costs by consuming less power than standard CCFL-backlit displays, making it safer for the environment.
Pricing and Availability
The new Acer BE270U monitor is offered through online channel partners in the United States with a three-year limited parts and labor warranty. Estimated selling prices begin at $499.99.
Subject: General Tech | November 8, 2016 - 06:59 PM | Jeremy Hellstrom
Tagged: long term storage, nitrogen vacancy, diamond
Atomic impurities in diamonds, specifically negatively charged nitrogen vacancy centres, could be used for extremely long term storage. Researchers have used optical microscopy to read, write and reset the charge state and spin properties of those defects. This would mean that you could store data, in three dimensions, within these diamonds almost perpetually. There is one drawback: as the storage medium uses light, similar to Blu-ray and other optical media, exposure to light can degrade the stored data over time. You can read more about this over at Nanotechweb.
"The nitrogen vacancy (NV) centre can be used for long-term information storage. So say researchers at City University of New York–City College of New York who have used optical microscopy to read, write and reset information in a diamond crystal defect."
Here is some more Tech News from around the web:
- Orange Pi Releases Two Boards @ Hack a Day
- The Sega Genesis Is Officially Back In Production @ Slashdot
- What the Dell? NAND flash drought hits Texan monster – sources @ The Register
- 'F*cking crap' aside, Linus Torvalds says Linux 4.9 is coming along nicely @ The Register
- Google plugs Gmail vulnerability that allowed hackers to post from your account @ The Inquirer
- Samsung smartphone explodes in France, and it's not the Galaxy Note 7 @ The Inquirer
- Engineered spinach detects explosives @ Nanotechweb
- Archeer Foldable Solar Charger @ Benchmark Reviews
Subject: General Tech | November 8, 2016 - 08:00 AM | Scott Michaud
Tagged: voco, stylit, premiere pro, clovervr, audition, Adobe
At their annual MAX show, Adobe hosts a keynote called “Sneak Peeks”. Some of these contain segments that are jaw-dropping. For instance, there was an experimental plug-in at Adobe MAX 2011 that analyzed how a camera moved while its shutter was open, and used that data to intelligently reduce the resulting motion blur in the image. Two years later, the technology made its way into Photoshop. If you're wondering, the shadowy host on the right was Rainn Wilson from the US version of The Office, which should give some context to the humor.
While I couldn't find a stream of this segment as it happened, Adobe published three videos after-the-fact. The keynote was co-hosted by Jordan Peele and, while I couldn't see her listed anywhere, I believe the other co-host is Elissa Dunn Scott from Adobe. (Update, November 8th @ 12pm EST: Turns out I was wrong, and it was Kim Chambers from Adobe. Thanks, Anonymous commenter!)
The first (and most popular one to be reported on) is VoCo, which is basically an impressive form of text-to-speech. Given an audio waveform of a person talking, you are able to make edits by modifying the transcript. In fact, you are even able to write content that wasn't even in the original recording, and the plug-in will synthesize it based on what it knows of that person's voice. They claim that about 20 minutes of continuous speech is required to train the plug-in, so it's mostly for editing bloopers in audio books and podcasts.
In terms of legal concerns, Adobe is working on watermarking and other technologies to prevent spoofing. Still, the demo proves that the algorithm is possible, and on today's hardware, so anyone who wasn't already working on something similar might be now, and they might not implement the same protections. That is not Adobe's problem, of course. A company can't (and shouldn't be able to) prevent society from inventing something (although I'm sure the MPAA would love that). They can only research it themselves, and be as ethical with it as they can, or sit aside while someone else does it. Ultimately, it's on society to treat these situations correctly in the first place.
Moving on to the second demo: Stylit. This one is impressive in its own way, although not quite as profound. Basically, using a 2D drawing of a sphere, an artist can generate a material that can be applied to a 3D render. Working in whatever medium they like, from pencil crayons to clay, the artist's image defines the color and pattern of the shading ramp on the sphere, the shadow it casts, the background, and the floor. It's a cute alternative to mathematically-generated cel shading materials, and it even works in animation.
I guess you could call this a... 3D studio to the MAX... ... Maya-be?
The Stylit demo is available for free at their website. It is based on CUDA, and requires a fairly modern graphics card (they call out the GTX 970 specifically) and a decent webcam (such as the Logitech C920) or an Android smartphone.
Lastly, CloverVR is an Adobe Premiere Pro interface in VR. This will seem familiar if you were following Unreal Engine 4's VR editor development. Rather than placing objects in a 3D scene, though, it helps the editor visualize what's going on in their shot. The on-stage use case is aligning views between shots, so someone staring at a specific object will be looking at the matching object after the cut, without needing to correct with their head and neck, which is unnecessarily jarring.
Annnd that's all they have on their YouTube at the moment.
Subject: General Tech | November 8, 2016 - 02:07 AM | Scott Michaud
Tagged: valve, steam, pc gaming
As we mentioned last week, Valve was working on a major refresh of the Steam homepage, with a heavy emphasis on letting users find products that interest them. This update is now live, and will be presented to you the next time you load (or reload) the store page. They also have a banner link, right near the top, that highlights changes, including a few they've already made over the course of 2016.
One glaring thing that I noticed is the “Recently Viewed” block. There doesn't seem to be a way to disable it or otherwise limit the amount of history that it stores. While it is only visible to your own account, which should be fairly obvious, it could be a concern for someone who shares a PC or streams regularly. It's not a big issue, but it's one you would expect to have been considered.
Otherwise, I'd have to say that the update looks better. The dark gray and blue color scheme seems a bit more consistent than it was, and I definitely prefer the new carousel design.
What do you all think?
Subject: Graphics Cards | November 7, 2016 - 11:00 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Update, November 7th @ 5:25pm EST:
First, NVIDIA gave Ryan their official statement, which I included below verbatim.
GeForce Experience collects data to improve the application experience; this includes crash and bug reports as well as system information needed to deliver the correct drivers and optimal settings. NVIDIA does not share any personally identifiable information collected by GeForce Experience outside the company. NVIDIA may share aggregate-level data with select partners, but does not share user-level data. The nature of the information collected has remained consistent since the introduction of GeForce Experience 1.0. The change with GeForce Experience 3.0 is that this error reporting and data collection is now being done in real-time.
They also pointed to their GeForce Experience FAQ.
It sounds like there's a general consensus, both from NVIDIA and even their harshest critics, that the telemetry only affects GeForce Experience, and not the base driver. I still believe there should be a more granular opt-out that allows continued access to GeForce Experience, similar to the checkbox prompts that web browsers and Visual Studio present during install. Still, if this concerns you, and, as with Windows 10, it might not and that's okay, you can simply remove GeForce Experience.
Also, GamersNexus once again did a very technical breakdown of the situation. I think they made an error, though, since they claimed to have recorded traffic "for about an hour", which may not have included the once-per-day reporting time from Windows Task Scheduler. (My image below suggests, at least for my system, monitoring once per hour but reporting at 12:25pm and at user login.) I reached out to them on Twitter for clarification, but it looks like they may have just captured GeForce Experience's typical traffic.
Update, November 7th @ 7:15pm EST: Heard back from GamersNexus. They did check at the Windows Task Scheduler time as well, and they claim that they didn't see anything unusual. They aren't finished with their research, though.
Original news, posted November 6th @ 4:25pm EST, below.
Over the last day, users have found NVIDIA Telemetry Monitor added to Windows Task Scheduler. We currently don't know what it is or exactly when it was added, but we do know its schedule. When the user logs in, it runs an application that monitors... something... once every hour while the computer is active. Then, once per day (at just after noon on my PC) and once on login, it runs an application that reports that data, which I assume means sends it to NVIDIA.
Before we begin, NVIDIA (or anyone) should absolutely not be collecting data from personal devices without clearly explaining the bounds and giving a clear option to disable it. Lots of applications, from browsers to software development tools, include crash and error reporting, but they usually and rightfully ask you to opt-in. Microsoft is receiving a lot of crap for this practice in Windows 10, even with their “Basic” option, and, while most of those points are nonsense, there is ground for some concern.
I've asked NVIDIA if they have a statement regarding what it is, what it collects, and what their policy will be for opt-in and opt-out. I haven't received a response yet, because I sent it less than an hour ago on a weekend, but we'll keep you updated.
Subject: Mobile | November 7, 2016 - 08:04 PM | Jeremy Hellstrom
Tagged: msi, GS73 6RF Stealth Pro, 4k, GTX 1060M
You have two choices of display when purchasing an MSI GS73 6RF Stealth Pro: a 120Hz 1080p panel that supports neither FreeSync nor G-SYNC, or a 4K display. It is the 4K version that Kitguru has reviewed, powered by the mobile version of the GTX 1060, an i7-6700HQ, and 16GB of DDR4-2400. Storage is handled by a PCIe-based M.2 SSD as well as an HDD for extra capacity. Kitguru loved the look of the panel, but unfortunately the mobile GTX 1060 just doesn't have the power to game at that resolution; the laptop also came with more third-party software than they would have liked, but that did not ruin it for them. Check out the full review here.
"MSI have been producing a fine line of gaming-oriented laptops for the last couple of years and today we look at their latest super slimline 17 inch model which features a Core i7 processor, Nvidia GTX 1060 graphics, and a 4k IPS panel along with Steelseries keyboard and Killer networking."
Here are some more Mobile articles from around the web:
More Mobile Articles
- Surface Pro 4 - A Real Laptop Replacement @ Hardware Secrets
- Microsoft's Surface Studio desk-slab, Dial knob, Surface Book: We get our claws on new kit @ The Register
- The honor 8 Aurora Glass Smartphone @ TechARP