Subject: Processors | February 7, 2018 - 09:01 AM | Tim Verry
Tagged: Xeon D, xeon, servers, networking, micro server, Intel, edge computing, augmented reality, ai
Intel announced a major refresh of its Xeon D system-on-a-chip processors aimed at high density servers that bring the power of the datacenter as close to end user devices and sensors as possible to reduce TCO and application latency. The new Xeon D 2100-series SoCs are built on Intel’s 14nm process technology and feature the company’s new mesh architecture (gone are the days of the ring bus). According to Intel, the new chips are squarely aimed at “edge computing” and offer up to 2.9 times the network performance, 2.8 times the storage performance, and 1.6 times the compute performance of the previous generation Xeon D-1500 series.
Intel has managed to pack in up to 18 Skylake-based processing cores, Quick Assist Technology co-processing (for things like hardware-accelerated encryption/decryption), four DDR4 memory channels addressing up to 512 GB of DDR4 2666 MHz ECC RDIMMs, four Intel 10 Gigabit Ethernet controllers, 32 lanes of PCI-E 3.0, and 20 lanes of flexible high speed I/O that can be configured as up to 14 SATA 3.0 ports, four USB 3.0 ports, or 20 additional lanes of PCI-E. Of course, the SoCs also support Intel’s Management Engine, hardware virtualization, Hyper-Threading, Turbo Boost 2.0, and AVX-512 instructions with one FMA (fused multiply-add) unit.
Suffice it to say, there is a lot going on with these new chips, which represent a big step up in capabilities (and TDPs) and further bridge the gap between the Xeon E3 v5 family and the Xeon E5 family and its successor, the new Xeon Scalable Processors. Xeon D is aimed at datacenters where power and space are limited, and while the soldered SoCs are single socket (1P) setups, high density is achieved by filling racks with as many single processor Mini ITX boards as possible. Xeon D does not quite match the per-core clockspeeds of the “proper” Xeons, but it has significantly more cores than Xeon E3 and much lower TDPs and cost than Xeon E5. Its many lower-clocked, lower-power cores excel at burstable tasks such as serving up websites, where many threads may be generated and maintained for long periods of time without needing much processing power, and when new page requests do come in the cores are able to turbo boost to meet demand. For example, Facebook is using Xeon D processors to serve up its front end websites in its Yosemite OpenRack servers, where each server rack holds 192 Xeon D 1540 SoCs (four Xeon D boards per 1U sled) for 1,536 Broadwell cores. Other applications include edge routers, network security appliances, self-driving vehicles, and augmented reality processing clusters. The autonomous vehicle use case is perhaps the best example of just what the heck edge computing is. Rather than fighting the laws of physics to transfer sensor data back to a datacenter for processing and then back to the car in time for it to safely act on the processed information, the idea of edge computing is to bring most of the processing, networking, and storage power as close as possible to both the input sensors and the device (and human) that relies on accurate and timely data to make decisions.
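To put the "laws of physics" argument in perspective, here is a back-of-the-envelope sketch of round-trip propagation delay in fiber. The distances are illustrative assumptions on my part, not figures from Intel:

```python
# Back-of-the-envelope propagation latency: why edge computing matters.
# Assumed scenario: a distant datacenter 1,500 km away vs. an edge
# micro-server 1 km away. Light travels through optical fiber at roughly
# two-thirds its vacuum speed, i.e. about 200 km per millisecond.

C_FIBER_KM_PER_MS = 200.0  # ~200 km per millisecond in fiber

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in ms (ignores processing and queuing)."""
    return 2 * distance_km / C_FIBER_KM_PER_MS

datacenter_rtt = round_trip_ms(1500)  # 15 ms gone before any processing starts
edge_rtt = round_trip_ms(1)           # 0.01 ms -- the budget goes to compute

print(f"datacenter: {datacenter_rtt:.2f} ms, edge: {edge_rtt:.2f} ms")
```

Even before a single cycle of processing, the remote datacenter burns milliseconds that a car moving at highway speed cannot afford; the edge node makes that term negligible.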
As far as specifications go, Intel’s new Xeon D lineup includes 14 processor models broken up into three main categories. The Edge Server and Cloud SKUs include eight, twelve, and eighteen core options with TDPs ranging from 65W to 90W. Interestingly, the 18 core Xeon D does not feature the integrated 10 GbE networking the lower end models have, though it supports higher DDR4 memory frequencies. The two remaining classes of Xeon D SoCs are the “Network Edge and Storage” and “Integrated Intel Quick Assist Technology” SKUs. These are roughly similar, each with two eight core, one 12 core, and one 16 core processor (the former also has a quad core option that isn’t present in the latter category), though there is a big differentiator in clockspeeds. It seems customers will have to choose between core clockspeeds and Quick Assist acceleration (up to 100 Gbps): the chips that do have QAT are clocked much lower than the chips without the co-processor hardware, which makes sense because the TDPs are similar, so clocks had to be sacrificed to maintain the same core count. Thanks to the updated architecture, Intel is encroaching a bit on the per-core clockspeeds of the Xeon E3s and E5s, though once turbo boost comes into play the Xeon Ds can’t compete.
The flagship Xeon D 2191 offers two more cores (four additional threads) than the previous Broadwell-based flagship Xeon D 1577, as well as higher clockspeeds: 1.6 GHz base versus 1.3 GHz, and 2.2 GHz turbo versus 2.1 GHz. The Xeon D 2191 does lack the integrated networking, though. Comparing the two 16 core refreshed Xeon Ds to the 16 core Xeon D 1577, Intel has managed to increase clocks significantly (up to 2.2 GHz base and 3.0 GHz boost versus 1.3 GHz base and 2.1 GHz boost), double the number of memory channels and network controllers, and increase the maximum amount of memory from 128 GB to 512 GB. All those increases did come at the cost of TDP, though, which went from 45W to 100W.
Xeon D has always been an interesting platform, both for enthusiasts running VM labs and home servers and for big data enterprise clients building and serving up the 'next big thing' built on the astonishing amounts of data people create and consume on a daily basis. (Intel estimates a single self driving car would generate as much as 4 TB of data per day, the average person in 2020 will generate 1.5 GB of data per day, and VR recordings such as NFL True View will generate up to 3 TB a minute!) With Intel ramping up the core count, per-core performance, and I/O, the platform is starting not only to bridge the gap between single socket Xeon E3 and dual socket Xeon E5 but to claim a place of its own in the fast-growing server market.
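For a sense of scale, those headline figures translate into sustained data rates with some simple arithmetic (using decimal terabytes, since that is how such estimates are usually quoted):

```python
# Converting Intel's quoted data-volume estimates into sustained rates.
TB = 10**12  # decimal terabyte, in bytes

car_per_day = 4 * TB        # self-driving car: 4 TB per day
vr_per_minute = 3 * TB      # NFL True View VR recording: 3 TB per minute

car_rate_mb_s = car_per_day / 86_400 / 10**6   # seconds per day -> MB/s
vr_rate_gb_s = vr_per_minute / 60 / 10**9      # seconds per minute -> GB/s

# ~46 MB/s sustained for the car; a full 50 GB/s for True View capture
print(f"car: ~{car_rate_mb_s:.0f} MB/s, True View: {vr_rate_gb_s:.0f} GB/s")
```

In other words, a single car saturates a sizeable fraction of a SATA link around the clock, while True View capture is firmly in "many 10 GbE links in parallel" territory, which is exactly the I/O profile these SoCs target.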
I am looking forward to seeing how Intel's partners and the enthusiast community take advantage of the new chips and what new projects they will enable. It is also going to be interesting to see the responses from AMD (e.g. Snowy Owl and, to a lesser extent, Great Horned Owl at the low and niche ends, as the latter has fewer CPU cores but a built-in GPU) and the various ARM partners (Qualcomm Centriq, X-Gene, Ampere, etc.*) as they vie for this growing market space with higher powered SoC options in 2018 and beyond.
- New Intel Xeon D Broadwell Processors Aimed at Low Power, High Density Servers
- Intel Xeon Scalable Processor Launch - New Architecture, New Platform for Data Center
- Qualcomm Centriq 2400 Arm-based Server Processor Begins Commercial Shipment
- Today's bonus AMD rumour: Starship, Naples, Zeppelin and a flock of Owls
*Note that X-Gene and Ampere are both now backed by the Carlyle Group: MACOM sold X-Gene to Project Denver Holdings, and Ampere, led by ex-Intel employees, is Carlyle-backed as well.
Subject: General Tech | December 11, 2017 - 06:46 PM | Tim Verry
Tagged: shazam, music streaming, augmented reality, apple music, apple
Apple has confirmed its plans to acquire the London-based company Shazam, which is best known for its song recognition app for smartphones. The deal, which industry sources estimate to be worth a bit over $400 million, would see Shazam and its employees become part of Apple, which has been in talks with Shazam for the past five months (and "exclusively dating" for two).
TechCrunch quotes Apple in stating:
“We are thrilled that Shazam and its talented team will be joining Apple. Since the launch of the App Store, Shazam has consistently ranked as one of the most popular apps for iOS. Today, it’s used by hundreds of millions of people around the world, across multiple platforms. Apple Music and Shazam are a natural fit, sharing a passion for music discovery and delivering great music experiences to our users. We have exciting plans in store, and we look forward to combining with Shazam upon approval of today’s agreement.”
Currently, Shazam is available on a massive number of devices, with apps for Android, iOS, watchOS (Apple Watch), BlackBerry OS, macOS, and Windows machines equipped with a microphone. Its apps have been downloaded well over 1 billion times, and its users have performed more than 30 billion song searches – "Shazams" – since its launch. The Shazam app allows users to identify songs by recording short clips, from which Shazam creates a time-frequency spectrogram that it compares against its database of known spectrograms for 11 million songs in an attempt to find a match. It's not perfect, especially if you are in a loud bar or at home and the song you want to identify is in the background of a TV show with a lot of dialogue over it, but it works for the most part. Shazam has further updated its app through the years to incorporate social networking aspects, link YouTube videos of identified songs, provide links to Amazon Music and Apple Music to purchase the song, and display song lyrics and information on the artist. The company has also branched out into marketing partnerships as well as image recognition and augmented reality projects, which may have also piqued Apple's interest.
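The core idea behind that matching process, fingerprinting a clip by its strongest spectrogram peaks, can be sketched in a few lines. This is a toy illustration of the general technique, not Shazam's actual implementation; all function names, window sizes, and peak counts here are my own illustrative choices:

```python
# Toy sketch of audio fingerprinting via spectrogram peaks.
import numpy as np

def spectrogram(signal, window=256, hop=128):
    """Magnitude spectrogram via a simple sliding-window FFT."""
    frames = [signal[i:i + window] * np.hanning(window)
              for i in range(0, len(signal) - window, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

def peak_fingerprint(spec, top_n=20):
    """Keep only the strongest time/frequency peaks; these survive noise."""
    flat = np.argsort(spec, axis=None)[-top_n:]
    times, freqs = np.unravel_index(flat, spec.shape)
    return sorted(zip(times.tolist(), freqs.tolist()))

# Two recordings of the "same song" (a 440 Hz tone at 8 kHz sampling),
# one clean and one with background noise mixed in.
t = np.linspace(0, 1, 8000, endpoint=False)
clean = np.sin(2 * np.pi * 440 * t)
noisy = clean + 0.2 * np.random.default_rng(0).normal(size=t.size)

fp_clean = peak_fingerprint(spectrogram(clean))
fp_noisy = peak_fingerprint(spectrogram(noisy))

# Both fingerprints concentrate on the same frequency bin (~440 Hz),
# which is what makes database matching robust to background noise.
print(sorted({f for _, f in fp_clean}), sorted({f for _, f in fp_noisy}))
```

A real system would hash pairs of nearby peaks into compact keys and look those up against a database of millions of tracks, which is why it can tolerate the bar chatter that swamps everything but the loudest peaks.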
Interestingly, Apple was not the only – or even the first – company to approach Shazam about a possible acquisition. Specifically, Snap (the company behind Snapchat) and Spotify were also interested in buying up the London-based developer. While the talks with Spotify fell through, Snap originally approached Shazam six months ago, beating Apple to the punch, but apparently neither company was able to muster a stable enough or sizeable enough offer. It is natural that these three companies would be interested in folding Shazam into their own business units, since they already have partnerships in place with Shazam for various functionality and marketing reasons. Ars Technica notes that Shazam is used on the backend when asking Siri to identify a song, for example. Further, Spotify members with paid subscriptions can listen to full songs from within the Shazam app, and Shazam can be used within Snapchat to discover and share songs.
With Apple winning the war for Shazam, I am curious what this will mean for the future of Apple Music as well as the future of the standalone Shazam apps (especially those on non-Apple platforms, like the Android app and the song recognition functionality embedded in third party apps). Bringing Shazam in house is a smart move for Apple, which is looking to advance its streaming music service. If anything, it will open the Play Store up for new apps to move in if Apple does pull Shazam inside its walled garden as an Apple exclusive offering.
What are your thoughts on the acquisition? Do you use Shazam?
Subject: General Tech | June 27, 2017 - 01:13 PM | Jeremy Hellstrom
Tagged: Jeri Ellsworth, Rick Johnson, CastAR, augmented reality
CastAR, the brainchild of former Valve employees Jeri Ellsworth and Rick Johnson, is no more. The two were part of the original team at Valve that helped create SteamVR; their focus was on augmented reality applications, which Valve eventually decided to drop, and Jeri and Rick were allowed to keep the IP they helped develop. They went on to launch a very successful Kickstarter to help develop their technology, and when they eventually received $15 million in investments they chose to return the money invested by their Kickstarter backers – a very different reaction than others have had.
Unfortunately, they have not been able to continue to attract investment for their AR products, and according to the information Polygon garnered, they have significantly downsized their number of employees and may be seeking to sell their technology. This is exceptionally bad news, as their first set of AR goggles was set to launch later this year. The market seems far more willing to invest in VR than in AR, which presents a large hurdle for smaller businesses trying to succeed. Hopefully we will hear happier news about Jeri, her team, and CastAR in the future, but for now it looks rather bleak.
"In 2013, Technical Illusions got its start with a hugely successful Kickstarter, netting just north of one million dollars. This success drew the attention of investors and eventually led to a funding round of $15 million. With this success, Technical Illusions decided to refund the backers of its Kickstarter."
Here is some more Tech News from around the web:
- Google Slapped With $2.7 Billion By EU For Skewing Searches @ Slashdot
- Ukrainian Banks, Electricity Firm Hit by Fresh Cyber Attack; Reports Claim the Ransomware Is Quickly Spreading Across the World @ Slashdot
- Solving the NVMeF-JBOF-is-not-a-SAN conundrum @ The Register
- Dell drops optical drive price-fixing lawsuit against Hitachi @ The Register
- Linksys EA9500 MAX-STREAM AC5400 MU-MIMO Gigabit Router Review @ NikKTech
- TP-Link Deco M5 Mesh Wi-Fi Router System @ Custom PC Review
- WiMiUs L1 4K Action Cam @ Benchmark Reviews
Subject: General Tech, Processors, Displays, Shows and Expos | August 16, 2016 - 01:50 PM | Ryan Shrout
Tagged: VR, virtual reality, project alloy, Intel, augmented reality, AR
At the opening keynote of this summer’s Intel Developer Forum, CEO Brian Krzanich announced a new initiative to enable a completely untethered VR platform called Project Alloy. Using Intel processors and sensors, the goal of Project Alloy is to move all of the necessary compute into the headset itself, including enough battery to power the device for a typical session, removing the need for a high-powered PC and enabling a truly cordless experience.
This is indeed the obvious end-game for VR and AR, though Intel isn’t the first to demonstrate a working prototype. AMD showed the Sulon Q, an AMD FX-based system that was a wireless VR headset. It had real specs too, including a 2560x1440 90Hz OLED display, 8GB of DDR3 memory, and an AMD FX-8800P APU with embedded R7 graphics. Intel’s Project Alloy is currently using unknown hardware and won’t have a true prototype release until the second half of 2017.
There is one key advantage that Intel has implemented with Alloy: RealSense cameras. The idea is simple but the implications are powerful. Intel demonstrated using your hands, and even other real-world items, to interact with the virtual world. RealSense cameras use depth sensing to track hands and fingers very accurately, and with a device integrated into the headset and pointed out and down, Project Alloy prototypes will be able to “see” and track your hands, integrating them into the game and VR world in real time.
The demo that Intel put on during the keynote definitely showed the promise, but the implementation was clunky and less than what I expected from the company. Real hands simply showed up in the game, rather than being represented by rendered hands that track accurately, and it definitely put a schism in the experience. Obviously it’s up to the application developer to determine how your hands would actually be represented, but it would have been better to showcase that capability in the live demo.
Better than just tracking your hands, Project Alloy was able to track a dollar bill (why not a Benjamin Intel??!?) and use it to interact with a spinning lathe in the VR world. It interacted very accurately and with minimal latency – the potential for this kind of AR integration is expansive.
Those same RealSense cameras and their data are used to map the space around you, preventing you from running into things or people or cats in the room. This enables the first “multi-room” tracking capability, giving VR/AR users a new range of flexibility and usability.
Though I did not get hands-on with the Alloy prototype itself, the unit on stage looked pretty heavy and pretty bulky. Comfort will obviously be important for any kind of head mounted display, and Intel has plenty of time to iterate on the design over the next year to get it right. Both AMD and NVIDIA have been talking up the importance of GPU compute for providing high quality VR experiences, so Intel has an uphill battle to prove that its solution, without the need for external power or additional processing, can truly provide the untethered experience we all desire.
Subject: Mobile | June 9, 2016 - 02:30 PM | Sebastian Peak
Tagged: Snapdragon 652, smartphone, project tango, phablet, PHAB2, Lenovo, augmented reality, AR
Lenovo has unveiled the PHAB2 family at their Lenovo Tech World event today, featuring the PHAB2 Pro, a phablet-sized mobile device powered by Google's Project Tango (now simply Google Tango) augmented-reality technology.
Lenovo PHAB2 Pro (Image credit: Lenovo)
“Unlike any other phone, the PHAB2 Pro, powered by Tango technology – a set of sensors and software from Google that senses and maps its surroundings – makes a host of cutting-edge smartphone augmented reality (AR) experiences possible. For example, using AR apps, students can place true-to-scale virtual dinosaurs in their classrooms and enhance their learning through AR data overlays that appear while they walk around the creatures. AR gaming experiences let you play virtual dominos on your kitchen table, raise a digital pet in your bedroom and fight back swarms of aliens invading your house.
With Tango technology PHAB2 Pro can even begin to change the way people think about mapping indoor spaces to create new experiences like future augmented reality museum tours via the GuidiGO app. With Tango, PHAB2 Pro offers unprecedented experiences on a smartphone that will continually learn and improve.”
The large phablet devices are full smartphones, not just small tablets, and the three models offer widely varying specs with significant improvements in each successive model. We'll begin by looking at the base configuration.
The PHAB2 (Image credit: Lenovo)
- Display: 6.4-inch HD (1280x720) IPS
- Processor: MediaTek MTK 8735 Quad-Core Processor
- Memory: 3 GB
- Storage: 32 GB (expandable up to 128 GB via microSD)
- Sound: Triple Array Microphone with Active Noise-Cancellation; Dolby Atmos + Dolby Audio Capture 5.1
- Rear: 13 MP PDAF Fast-Focus
- Front: 5 MP 85° Wide Angle
- Battery: 4050 mAh
Next up is the PHAB2 Plus, which improves on the base model's display, SoC, and particularly the cameras:
“The PHAB2 Plus comes with two 13MP rear cameras that have instant focus, fast f1.8 lenses and the same professional-grade Fujitsu Milbeaut image signal processor that powers the Leica camera.”
Lenovo PHAB2 Plus
The PHAB2 Plus (Image credit: Lenovo)
- Display: 6.4-inch FHD (1920x1080) IPS
- Processor: MediaTek MTK 8783 Octa-Core Processor
- Memory: 3 GB
- Storage: 32 GB (expandable up to 128 GB via microSD)
- Sound: Triple Array Microphone with Active Noise-Cancellation; Dolby Atmos + Dolby Audio Capture 5.1
- Rear: 13 MP Dual Camera Milbeaut ISP, F2.0 Aperture, 1.34 μm Big Pixel, Laser Focus with PDAF Light Supplement
- Front: 8 MP Fixed-Focus, F2.2 Aperture, 1.4 μm Big Pixel, Light Supplement
- Battery: 4050 mAh
Next up we have the PHAB2 Pro, the flagship of the lineup, which moves to a Qualcomm Snapdragon SoC from the MediaTek chips in the first two phones, and offers a higher screen resolution and (most importantly for this launch) Google Tango support - the first product equipped with this AR technology.
Lenovo PHAB2 Pro
The PHAB2 Pro (Image credit: Lenovo)
- Display: 6.4-inch QHD (2560x1440) IPS Assertive Display
- Processor: Qualcomm Snapdragon 652 Processor Built for Tango
- Memory: 4 GB
- Storage: 64 GB (expandable up to 128 GB via microSD)
- Sound: Triple Array Microphone with Active Noise-Cancellation; Dolby Atmos + Dolby Audio Capture 5.1
- Rear: 16 MP PDAF Fast-Focus, Depth Sensor for Tango, Motion Tracking Sensor for Tango
- Front: 8 MP Fixed-Focus, F2.2 Aperture, 1.4 μm Big Pixel
- Battery: 4050 mAh
There will be a retail presence in the U.S. for the PHAB2 Pro, with Best Buy confirmed as an outlet for the Google Tango device. Additionally, in a move that is perplexing at first, the PHAB2 Pro will be featured for sale in Lowe's home improvement stores. (Wait, what?) It's a move which actually makes sense once you’ve read Lenovo’s press release:
“Homeowners can also now use their PHAB2 Pro to remodel their homes by visualizing real home furnishings in their living rooms and kitchens. Home improvement company Lowe’s is one of the first partners to develop a Tango-enabled application, Lowe’s Vision. The app empowers customers by leveraging Tango technology to measure spaces and visualize how products like appliances and décor, or materials like countertops or backsplash tile, will all look and fit together in a room. With Lowe’s Vision, customers will be able to control a new generation of augmented reality tools with a mere tap of the finger.”
As for pricing, the base PHAB2 has an MSRP of $199, the PHAB2 Plus moves up to $299, and the PHAB2 Pro will be $499. Availability is set for September of this year.
Introduction, Virtual Insanity and Game of Making Games panels
Our second day at Quakecon 2012 started bright and early with expert panel discussions led by some of the gaming industry's elite game designers and programmers from around the globe. These panel discussions focused primarily on the process different game studios go through to produce triple-A titles and on current developments in virtual reality headset technology. There were also more discussions about creating mods for games like Elder Scrolls V: Skyrim and utilizing modding communities as resources to produce higher quality games.
In between panel discussions, Quakecon hosted the first round of their annual Bawls chugging competition. BYOC gamers and event attendees were also able to try out a few game demos of Smite, Rise of the Triad, Dishonored, and Doom 3 BFG Edition. There were also several "quick draw" Quake Live matches to give out raffle tickets for a chance to win a new 2012 Ford Shelby GT500 Coupe.