Subject: General Tech | March 29, 2018 - 03:10 PM | Tim Verry
Tagged: project trillium, nvidia, machine learning, iot, GTC 2018, GTC, deep learning, arm, ai
During GTC 2018, NVIDIA and ARM announced a partnership that will see ARM integrate NVIDIA's NVDLA deep learning inferencing accelerator into the company's Project Trillium machine learning processors. The NVIDIA Deep Learning Accelerator (NVDLA) is an open source modular architecture optimized specifically for inferencing operations such as object and voice recognition. Bringing that acceleration to the wider ARM ecosystem through Project Trillium will enable a massive number of phones, tablets, Internet-of-Things, and embedded devices to do inferencing at the edge, that is, without the complexity and latency of relying on cloud processing. This means potentially smarter voice assistants (e.g. Alexa, Google Assistant), doorbell cameras, lighting, and security around the home, and better AR, natural translation, and assistive technologies out-and-about on your phone.
Karl Freund, lead analyst for deep learning at Moor Insights & Strategy, was quoted in the press release as saying:
“This is a win/win for IoT, mobile and embedded chip companies looking to design accelerated AI inferencing solutions. NVIDIA is the clear leader in ML training and Arm is the leader in IoT end points, so it makes a lot of sense for them to partner on IP.”
ARM's Project Trillium was announced back in February and is a suite of processor IP optimized for parallel, low latency workloads; it includes a Machine Learning processor, an Object Detection processor, and neural network software libraries. NVDLA is a hardware and software platform based on the deep learning accelerator in NVIDIA's Xavier SoC. The hardware is highly modular and configurable and can feature a convolution core, single data processor, planar data processor, channel data processor, and data reshape engines. The NVDLA can be configured with all or only some of those elements, and designers can independently scale them up or down depending on what processing acceleration they need for their devices. NVDLA connects to the main system processor over a control interface and through two AXI memory interfaces (one optional) that connect to system memory and (optionally) to dedicated high bandwidth memory (not necessarily HBM, but, for example, the accelerator's own SRAM).
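To make that modularity concrete, here is a minimal illustrative sketch in Python of how a particular NVDLA build might be described. This is not the project's actual spec format (the real hardware configurations live in the project's own files on GitHub), and the unit sizes and weights below are hypothetical; only the block names come from the description above.

```python
# Hypothetical NVDLA build description -- illustrative only; the real project
# defines its hardware configurations in spec files in the GitHub repository.
nvdla_config = {
    "convolution_core":        {"enabled": True,  "macs": 1024},  # hypothetical MAC count
    "single_data_processor":   {"enabled": True},                 # activation functions
    "planar_data_processor":   {"enabled": True},                 # pooling
    "channel_data_processor":  {"enabled": False},                # normalization, dropped to save area
    "data_reshape_engine":     {"enabled": True},                 # tensor reshape/transpose
    "secondary_axi_interface": {"enabled": False},                # skip the optional dedicated SRAM port
}

def relative_area(cfg):
    """Toy estimate: smaller configurations trade peak throughput for area and power."""
    weights = {"convolution_core": 10, "single_data_processor": 2,
               "planar_data_processor": 2, "channel_data_processor": 2,
               "data_reshape_engine": 1, "secondary_axi_interface": 1}
    return sum(w for name, w in weights.items() if cfg[name]["enabled"])

print("Relative area of this configuration:", relative_area(nvdla_config))
```

The point of the sketch is simply that a device maker picks which sub-units to instantiate and how large to make them, rather than taking the accelerator as a fixed block.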
NVDLA is presented as a free and open source architecture that promotes a standard way to design deep learning inferencing hardware that can accelerate inference on trained neural networks (with the training itself done on other hardware, perhaps a DGX-2). The project, which hosts its code on GitHub and encourages community contributions, goes beyond the Xavier-derived hardware and includes drivers, libraries, TensorRT support (upcoming) for accelerating Google's TensorFlow models, testing suites and SDKs, a deep learning training infrastructure compatible with the NVDLA software and hardware, and system integration support.
Bringing the "smarts" of smart devices to the local hardware and closer to the users should mean much better performance, and using specialized accelerators will reportedly offer the performance levels needed without blowing past low power budgets. Internet-of-Things (IoT) and mobile devices are not going away any time soon, and the partnership between NVIDIA and ARM should make it easier for developers and chip companies to offer smarter (and, please, more secure!) smart devices.
- NVDLA Primer
- Project Trillium: Machine Learning on ARM
- NVIDIA Announces DGX-2 with 16 GV100s & 8 100Gb NICs
- GTC 2018: NVIDIA Announces Volta-Powered Quadro GV100
- NVIDIA Teases Low Power, High Performance Xavier SoC That Will Power Future Autonomous Vehicles
- NVIDIA Launches Jetson TX2 With Pascal GPU For Embedded Devices
- ARM Announces Project Trillium, a New Dedicated AI Processing Family
Subject: General Tech | February 21, 2018 - 09:00 AM | Josh Walrath
Tagged: modem, Kigen, iSIM, iot, cortex, cellular, arm
Last year ARM went on a bit of a buying spree thanks to the financial help of its holding company, SoftBank. One of the companies it scooped up was Simulity Labs, for around £12 million. Simulity was developing IoT security products based on eSIM technology and a robust OS that provides provisioning on a cellular network.
Many believe that the nearly ubiquitous cellular networks that surround us are the key to truly successful IoT products. There are massive cellular deployments around the world, the spectrum is well regulated, and security through SIM cards is a well-known and well-understood process. It is not impossible to break this security, but it is questionable whether it is worth the time and effort to do so.
ARM has gone ahead and provided the means to productize and push this technology, with the aim of providing a vast, secure IoT infrastructure that would be relatively easy to roll out on current cellular networks. There are multiple parts to this technology, but ARM is hoping to offer an all-in-one solution that provides an inexpensive platform for OEMs and Mobile Network Operators (MNOs) to roll out products on.
Subject: General Tech | November 8, 2017 - 01:15 PM | Jeremy Hellstrom
Tagged: logitech, iot, harmony link
If you own a Logitech Harmony Link and registered it then you already know, but for those who did not receive the email: your device will become unusable in March. According to the information Ars Technica acquired, Logitech has decided not to renew a so-called "technology certificate license," which means the Link will no longer work. It is not clear what this certificate is, nor why the lack of it will brick the Link, but that is what will happen. Apparently if you have a Harmony Link that is still under warranty you can get a free upgrade to a Harmony Hub; if your Link is out of warranty you can get a 35% discount. Why exactly one would want to purchase another one of these devices, which can evidently be remotely destroyed, is an interesting question, especially as there was no monthly contract or service agreement suggesting this was a possibility when customers originally purchased their device.
"Customers received an e-mail explaining that Logitech will "discontinue service and support" for the Harmony Link as of March 16, 2018, adding that Harmony Link devices "will no longer function after this date."
Here is some more Tech News from around the web:
- This could be our favorite gadget of 2017: A portable projector @ The Register
- 'How Chrome Broke the Web' @ Slashdot
- Don't worry about those 40 Linux USB security holes. That's not a typo @ The Register
- Flaw Crippling Millions of Crypto Keys Is Worse Than First Disclosed @ Slashdot
- Highly flexible organic flash memory for foldable and disposable electronics @ Phys.org
- KRACK whacked, media playback holes packed, other bugs go splat in Android patch pact @ The Register
- The Biggest Tech Fails of the Last Decade @ TechSpot
Subject: General Tech | October 20, 2017 - 02:24 PM | Jeremy Hellstrom
Tagged: security, Reaper, iot
There is another IoT botnet running rampant, with several million devices already infected across over a million businesses and homes, according to the report over at The Inquirer. Experts are expecting IoT_reaper to be worse than Mirai once it is activated, as it is far more sophisticated than that botnet. Sometime in the near future you can expect serious issues as routers, IP cameras, and fridges start launching DDoS attacks. There is little you can do at this point apart from ensuring your devices are patched and their firmware is up to date. You can get an idea of the scope of this botnet by following the link in the story.
"Check Point first unearthed the botnet, codenamed 'IoT_reaper', at the beginning of September and claims that, since, it's already enslaved millions of IoT devices including routers and IP cameras from firms including GoAhead, D-Link, TP-Link, Avtech, Netgear, MikroTik, Linksys and Synology."
Here is some more Tech News from around the web:
- Google faces $10k-a-day fines if it defies court order to hand over folks' private overseas email @ The Register
- Ubuntu 17.10 launches welcoming back the laughing GNOME @ The Inquirer
- Sid Meier's Civilization III Free @ Humble Bundle
- CableLabs, Cisco working on LTE-over-DOCSIS @ The Register
- Reolink 5MP Security Camera Review @ OCC
Subject: General Tech | July 20, 2017 - 03:50 PM | Jeremy Hellstrom
Tagged: iot, Devil's Ivy, cameras, security, gSOAP
gSOAP is an open-source code library that allows hardware to be configured and controlled via web connections and is used by hundreds of companies, including Axis, Microsoft, IBM, Adobe, and Xerox. It has a vulnerability that allows an attacker to trigger a stack overflow by sending a specific POST command over port 80 to a device, which in the case of cameras allows the attacker to watch the live feed. The vulnerability was patched in an update to gSOAP, so future products will not have this issue; however, any camera built on that library which is currently in use remains vulnerable. The manufacturers would have to create an update to their own software and push it out to all the cameras currently in use to resolve this issue, and if there is one thing we know for sure about IoT products, it is that these patches do not tend to be created, let alone pushed out.
"Security researchers investigating internet-connected video cameras have uncovered a bug that could conceivably leave millions of devices open to easy pwnage."
Here is some more Tech News from around the web:
- Intel has 'eliminated' its entire wearables division @ The Inquirer
- Microsoft will support Windows 10 on Clover Trail after all (well, a bit) @ The Inquirer
- Ethereum Co-Founder Says Cryptocurrencies Are 'a Ticking Time Bomb' @ Slashdot
- The Kaspersky Palaeontology of Cybersecurity Conference @ TechARP
- Amazon Echo Show @ Hardware Secrets
- Apple hurls out patches for dozens of security holes in iOS, macOS @ The Register
Subject: General Tech, Graphics Cards | May 27, 2017 - 12:18 AM | Tim Verry
Tagged: vision fund, softbank, nvidia, iot, HPC, ai
SoftBank, the Tokyo-based Japanese telecom and internet technology company, has reportedly quietly amassed a 4.9% stake in graphics chip giant NVIDIA. Bloomberg reports that SoftBank has carefully invested $4 billion into NVIDIA, avoiding the need for regulatory approval in the US by keeping its investment under 5% of the company. SoftBank has promised the current administration that it will invest $50 billion into US tech companies, and it seems that NVIDIA is the first major part of that plan.
NVIDIA's Tesla V100 GPU.
Led by Chairman and CEO Masayoshi Son, SoftBank is not afraid to invest in technology companies it believes in, with major past acquisitions and investments in companies like ARM Holdings, Sprint, Alibaba, and game company Supercell.
The $4 billion investment makes SoftBank the fourth-largest shareholder in NVIDIA, whose stock has rallied on SoftBank's purchases and vote of confidence. The $100 billion Vision Fund (currently at $93 billion) may also follow SoftBank's lead in acquiring a stake in NVIDIA, which is involved in graphics, HPC, AI, deep learning, and gaming.
Overall, this is good news for NVIDIA and its shareholders. I am curious what other plays SoftBank will make for US tech companies.
What are your thoughts on SoftBank investing heavily in NVIDIA?
Subject: General Tech | May 10, 2017 - 04:44 PM | Jeremy Hellstrom
Tagged: fuschia, google, Android, iot
Fuchsia is still a work in progress that has been available on GitHub for a while now, but we haven't really seen a demonstration of it in action. A Texan enthusiast has been working on creating one, and you can take a peek at it in this video over at The Register. The tiny OS is designed to run on almost anything, from smart light bulbs to phones and even full-sized computers. It is based on BSD with additional resources developed at MIT and will be backwards compatible with current Android libraries.
"When Fuchsia broke cover last August, we noted the project's ambition. The presence of a compositor indicated it was capable of running on more than lightbulbs and routers, although the tiny new Magenta kernel also allows it go there too."
Here is some more Tech News from around the web:
- Nvidia GTC: A first look at Nvidia's new campus @ The Inquirer
- It's 2017 and Windows PCs are being owned by EPS files, webpages @ The Register
- Windows 10 Now On 500 Million Devices, Up By 200 Million in a Year @ Slashdot
- Persirai: Mirai-a-like malware is your latest IoT security worry @ The Inquirer
Subject: Systems | April 19, 2017 - 08:26 PM | Jeremy Hellstrom
Tagged: tinker board, iot, asus
The ASUS Tinker Board is a full system in a tiny form factor, similar to the Raspberry Pi or Arduino's products, to name a few competitors in the now busy market. At its heart is the Rockchip RK3288, with four ARM Cortex-A17 CPU cores running at 1.8GHz and a Mali-T764 GPU at 600MHz. They are available now for slightly more than the announced $54.99 and will run a Debian-based OS called ASUS TinkerOS.
Inside are an array of options for add-ins, including a 40-pin GPIO header, a 15-pin MIPI DSI, a 15-pin MIPI CSI, and a 2-pin contact point for PWM or S/PDIF signals (see the sketch after the spec list below for one way to drive the GPIO header). Externally you get four USB 2.0 ports, an HDMI port, and a 3.5mm audio jack to give you flexibility in how you utilize your Tinker Board. For connectivity there is a wired NIC as well as 802.11b/g/n WiFi and Bluetooth 4.0. You can read the full PR below.
Fremont, CA (April 19, 2017) -- ASUS, maker of the world’s best-selling, most award-winning motherboards, is excited to launch the ASUS Tinker Board in North America today. Imagine the freedom to make your ideas come alive, the ability to invent an IoT device for a connected home or just having fun creating an entertainment hub for the family or powering your DIY robot project at school. With Tinker Board, the possibilities to create personalized devices are endless. Tinker Board is a single-board computer (SBC), which makes it the ideal foundation for makers, hobbyists, educators, and electronic DIY enthusiasts to develop and build low-cost, great-performing computers.
Features & Functionality
ASUS Tinker Board offers class-leading performance, robust multimedia support, IoT connectivity, and enhanced DIY design and compatibility with a wide range of leading SBC chassis and accessories. The result is a near credit card sized computer that offers people the freedom to tinker and apply their ingenuity to create platforms for a wide variety of uses.
Key features of Tinker Board include:
- CPU: 1.8GHz Rockchip RK3288 SoC quad-core processor
- GPU: Mali-T764
- Video: HD/UHD video playback support, including H.264/H.265 decoding
- Audio: 192kHz/24-bit audio support
- Memory: 2GB of dual-channel LPDDR3
- Storage: Micro SD(TF) slot features SD 3.0 support
- Connectivity: Bluetooth 4.0 + EDR and on-board 802.11b/g/n WiFi
- Networking: 1Gb Ethernet
- Ports: (4) USB2.0 ports, (1) HDMI 1.4 out port, (1) 3.5mm audio jack
- I/O Ports: (1) 40-pin GPIO interface header, (1) 15-pin MIPI DSI, (1) 15-pin MIPI CSI, (1) 2-pin contact point for PWM and S/PDIF signals
- Power: Suggested 5V/2A AC adaptor via the micro-USB port (power adaptor not included)
- OS: Debian-based Linux & Android support
- Dimensions/Weight: 85.60mm x 56mm x 21mm, 45g without included heatsink
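As referenced above, here is a minimal sketch of driving a pin on the 40-pin GPIO header from Python using the generic Linux sysfs GPIO interface, which TinkerOS (as a Debian-based distribution of that era) exposes. The GPIO number is a placeholder assumption; the sysfs number for a given physical header pin must be looked up in the Tinker Board pinout, and ASUS also ships its own GPIO library, which is not used here.

```python
# Minimal LED-blink sketch over the Linux sysfs GPIO interface (run as root).
# The GPIO number below is hypothetical -- check the Tinker Board pinout for
# the sysfs number that maps to the physical header pin you wire up.
import time

GPIO = "17"                  # placeholder sysfs GPIO number
BASE = "/sys/class/gpio"

def write(path, value):
    with open(path, "w") as f:
        f.write(value)

write(BASE + "/export", GPIO)                        # expose the pin to userspace
time.sleep(0.1)                                      # give the kernel a moment to create the node
write(BASE + "/gpio" + GPIO + "/direction", "out")   # configure the pin as an output

try:
    for _ in range(10):
        write(BASE + "/gpio" + GPIO + "/value", "1") # LED on
        time.sleep(0.5)
        write(BASE + "/gpio" + GPIO + "/value", "0") # LED off
        time.sleep(0.5)
finally:
    write(BASE + "/unexport", GPIO)                  # release the pin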
Subject: General Tech | March 16, 2017 - 12:51 PM | Jeremy Hellstrom
Tagged: iot, scary, scada, security, ics
The Register posted a cheerful article today discussing the security of the other Internet of Things, which they have dubbed the Internet of Big Things. Botnets formed out of compromised toasters, refrigerators, and webcams are one thing; taking over power stations and industrial equipment is quite another. Citizens of Ukraine know the dangers all too well, having had their power grid taken offline by nefarious means once in 2015 and again more recently. Take a read through to learn how vulnerabilities in industrial control systems (ICS) and supervisory control and data acquisition (SCADA) systems could be used to cause significant harm, as well as about a search engine reassuringly named Shodan.
"The Internet of Big Things exists because it makes perfect sense to have accessibility to equipment from afar. Industrial systems are complex, specialist items and for many such systems it’s common for there to be only a handful of qualified maintenance staff in the country, continent or world."
Here is some more Tech News from around the web:
- AMD Ryzen 5 Processor Family Introduction @ [H]ard|OCP
- Qualcomm doesn't want you to call its Snapdragon processors, er, processors @ The Inquirer
- Updategate: Latest Windows 10 build suggests background downloads are back @ The Inquirer
- Headphone batteries flame out mid-flight, ignite new Li-Ion fears @ The Register
- Microsoft new Surface Book enters mass production @ DigiTimes
- Microsoft's Slack-slapping 'Teams' slips into Office 365 @ The Register
- Corsair Lapdog Game Control Center @ Benchmark Reviews
Subject: General Tech, Processors | March 12, 2017 - 05:11 PM | Tim Verry
Tagged: pascal, nvidia, machine learning, iot, Denver, Cortex A57, ai
Measuring 50mm x 87mm, the Jetson TX2 packs quite a bit of processing power and I/O, including an SoC with two 64-bit Denver 2 cores with 2MB L2, four ARM Cortex-A57 cores with 2MB L2, and a 256-core GPU based on NVIDIA’s Pascal architecture. The TX2 compute module also hosts 8 GB of LPDDR4 (58.3 GB/s) and 32 GB of eMMC storage (SDIO and SATA are also supported). As far as I/O, the Jetson TX2 uses a 400-pin connector to attach the compute module to the development board or final product, and the I/O available to users will depend on the product it is used in; the compute module itself supports up to the following:
- 2 x DSI
- 2 x DP 1.2 / HDMI 2.0 / eDP 1.4
- USB 3.0
- USB 2.0
- 12 x CSI lanes for up to 6 cameras (2.5 GB/second/lane)
- PCI-E 2.0:
- One x4 + one x1 or two x1 + one x2
- Gigabit Ethernet
The Jetson TX2 runs the “Linux for Tegra” operating system. According to NVIDIA, the Jetson TX2 can deliver up to twice the performance of the TX1, or up to twice the efficiency by drawing 7.5 watts at the same performance.
The extra horsepower afforded by the faster CPU, updated GPU, and increased memory and memory bandwidth will reportedly enable smart end user devices with faster facial recognition, more accurate speech recognition, and smarter AI and machine learning tasks (e.g. personal assistants, smart street cameras, smarter home automation, and so on). Bringing more power locally to these types of Internet-of-Things devices is a good thing, as less reliance on the cloud potentially means more privacy (unfortunately there is not as much incentive for companies to make this type of product for the mass market, but you could use the TX2 to build your own).
Cisco will reportedly use the Jetson TX2 to add facial and speech recognition to its Cisco Spark devices. In addition to the hardware, NVIDIA offers SDKs and tools as part of JetPack 3.0. The JetPack 3.0 toolkit includes TensorRT, cuDNN 5.1, VisionWorks 1.6, CUDA 8, and support and drivers for OpenGL 4.5, OpenGL ES 3.2, EGL 1.4, and Vulkan 1.0.
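As a quick sanity check of the GPU side of that stack, here is a minimal sketch that queries the TX2's integrated Pascal GPU from Python using PyCUDA. PyCUDA is an assumption here: it is not part of JetPack and would need to be installed separately against the CUDA 8 toolkit that JetPack provides.

```python
# Minimal device query against the Jetson TX2's integrated GPU via PyCUDA.
# PyCUDA is assumed to be installed on top of JetPack's CUDA toolkit.
import pycuda.driver as cuda

cuda.init()
dev = cuda.Device(0)                       # the TX2 exposes its iGPU as device 0

major, minor = dev.compute_capability()
print("Device:", dev.name())
print("Compute capability: %d.%d" % (major, minor))
print("Total memory: %.1f GB (shared with the CPU on Tegra)" %
      (dev.total_memory() / 1024.0**3))
print("Multiprocessors:",
      dev.get_attribute(cuda.device_attribute.MULTIPROCESSOR_COUNT))
```

On a 256-core Pascal part you would expect two multiprocessors (128 CUDA cores each); beyond that, the same CUDA and cuDNN code paths used on desktop GPUs apply to the module.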
The TX2 will enable better, stronger, and faster (well, I don't know about stronger, heh) industrial control systems, robotics, home automation, embedded computers and kiosks, smart signage, security systems, and other connected IoT devices (that are, for the love of all processing, hardened and secured so they aren't used as part of a botnet!).
Interested developers and makers can pre-order the Jetson TX2 Development Kit for $599, with a ship date of March 14 for the US and Europe and “in the coming weeks” for other regions. If you just want the compute module sans development board, it will be available later this quarter for $399 (in quantities of 1,000 or more). The previous-generation Jetson TX1 Development Kit has also received a slight price cut to $499.