Subject: General Tech | August 19, 2016 - 01:40 AM | Scott Michaud
Tagged: mount & blade ii, taleworlds
Mount & Blade is a quite popular franchise in some circles. It is built around a fairly simple but difficult-to-master combat system that mixes melee, blocking, and ranged attacks, balanced by reload time (and sometimes accuracy) to keep every method viable. A 100 vs. 100 battle, including cavalry and other special units, is a unique experience. It is also a popular mod platform, although Warband's engine can be a little temperamental.
As such, there's quite a bit of interest in the upcoming Mount & Blade II: Bannerlord. The Siege game mode involves an attacking wave beating down a fortress, trying to open as many attack paths as possible, and eventually overrunning the defenders. The above video is from the defending perspective. Mechanically, it seems to have changed significantly from Warband, particularly from the Napoleonic Wars DLC that I'm used to. In that mod, attackers spawn infinitely until a time limit is reached. This version apparently focuses on single-life AI armies, which Warband had as Commander Battles.
Hmm. Still no release date, though.
Subject: General Tech | August 18, 2016 - 02:20 PM | Jeremy Hellstrom
Tagged: Intel, joule, iot, IDF 2016, SoC, 570x, 550x, Intel RealSense
Intel has announced Joule, the follow-up to their current SoC devices, Edison and Curie. They have moved away from the Quark processors they previously used to a current-generation Atom. The device is designed to compete against NVIDIA's Jetson, as it is far more powerful than a Raspberry Pi and destined for different uses. It will support Intel RealSense, perhaps appearing in the newly announced Project Alloy VR headset. Drop by Hack a Day for more details on the two soon-to-be-released models, the Joule 570x and 550x.
"The high-end board in the lineup features a quad-core Intel Atom running at 2.4 GHz, 4GB of LPDDR4 RAM, 16GB of eMMC, 802.11ac, Bluetooth 4.1, USB 3.1, CSI and DSI interfaces, and multiple GPIO, I2C, and UART interfaces."
Here is some more Tech News from around the web:
- Microsoft Windows UAC can be bypassed for untraceable hacks @ The Inquirer
- Microsoft PowerShell Goes Open Source and Lands On Linux and Mac @ Slashdot
- The Witcher 3: Wild Hunt Enables NVIDIA Ansel Support For 3D Stereo Screenshots @ Techgage
- Tech support scammers mess with hacker's mother, so he retaliated with ransomware @ The Register
- 90 per cent of people ignore security notices because their brains are too busy @ The Inquirer
- With Kingston HyperX It's All About Speed Global Giveaway @ NikKTech
Subject: General Tech | August 17, 2016 - 02:02 PM | Jeremy Hellstrom
Tagged: Enderal, SureAI, mod, skyrim
Enderal: The Shards of Order is a total conversion of the Steam version of Skyrim into a completely new game in a brand new world. The mod is 8 GB in size and requires a separate launcher, both available at Enderal.com, and you can expect between 30 and 100 hours of playtime. You may remember this team from Nehrim, their previous total conversion mod for Oblivion. Rock, Paper, SHOTGUN have covered this mod previously; however, until now it was only available in German. The full English version, including voice acting, is now complete and ready for you to dive into. You might want to consider unmodding your Skyrim before installing; the mod creates a copy of the Skyrim installation, so you can restore your Thomas the Tank Engine mod once you are set up.
"A final version of Enderal: The Shards of Order has been completed and can be downloaded for free now. While ‘Enderal’ sounds like it could be something made by a United States pharmaceutical company, it is actually a massive total conversion mod for Skyrim, not just adding new weapons or turning it into a survival game, but creating a whole new RPG using the raw materials of its parent."
Here is some more Tech News from around the web:
- Final Fantasy 15 hands-on: Brave new direction or just pandering to fans? @ Ars Technica
- MOO! Master Of Orion Reboot Leaves Early Access @ Rock, Paper, SHOTGUN
- The Legend Of THQnix: Dead Publisher Is Back, Kinda @ Rock, Paper, SHOTGUN
- AMD & NVIDIA GPU VR Performance: Valve's Robot Repair @ [H]ard|OCP
- Sunless Sea’s Former Lead Now Writing For Stellaris @ Rock, Paper, SHOTGUN
- Hands On: Dawn Of War III @ Rock, Paper, SHOTGUN
- Battlefield 1 Open Beta starts from 31st August @ HEXUS
- BioShock Original And Remastered Graphics Comparison @ Rock, Paper, SHOTGUN
- Without Kojima, Metal Gear becomes a multiplayer zombie action game @ Ars Technica
- Beyond No Man's Sky: 10 of the best games still to come in 2016 @ The Inquirer
Subject: General Tech | August 17, 2016 - 12:41 PM | Jeremy Hellstrom
Tagged: nvidia, Intel, HPC, Xeon Phi, maxwell, pascal, dirty pool
There is a spat going on between Intel and NVIDIA over the slide below, as you can read about over at Ars Technica. It seems that Intel have reached into the industry's bag of dirty tricks and polished off an old standby: testing new hardware and software against older products from their competitors. In this case the products were for high performance computing, Intel's new Xeon Phi pitted against NVIDIA's Maxwell on an older version of the Caffe AlexNet benchmark.
NVIDIA points out not only that they would have beaten Intel if an up-to-date version of the benchmarking software had been used, but also that the comparison should have been against their current architecture, Pascal. This is not quite as bad as putting undocumented flags into compilers to reduce the performance of competitors' chips, or predatory discount programs, but it shows that the computer industry continues to have only a passing acquaintance with fair play and honest competition.
"At this juncture I should point out that juicing benchmarks is, rather sadly, par for the course. Whenever a chip maker provides its own performance figures, they are almost always tailored to the strength of a specific chip—or alternatively, structured in such a way as to exacerbate the weakness of a competitor's product."
Here is some more Tech News from around the web:
- USB Implementers Forum introduces branding for safe USB-C charging @ The Inquirer
- Some Windows 10 Anniversary Update: SSD freeze @ The Register
- Intel Project Alloy: all-in-one VR headset takes aim at Google's Project Daydream @ The Inquirer
- Wanna build your own drone? Intel emits Linux-powered x86 brains for DIY flying gizmos @ The Register
- Intel's Optane XPoint DIMMs pushed back – source @ The Register
Subject: General Tech | August 16, 2016 - 08:10 PM | Scott Michaud
Vulkan-Tutorial.com, while not affiliated with The Khronos Group, is a good read to understand how the Vulkan API is structured. It is set up like the tutorials I followed when learning WebGL, which makes it quite approachable. I mean, we are still talking about the Vulkan API, which was in no way designed to be easy or simple, but introduction-level material is still good for developers of all skill levels, unless they're looking for specific advice.
To emphasize what I mean by “approachable”, the tutorial even includes screenshots of Visual Studio 2015 at some points, to help Windows users set up their build environment. Like... step-by-step screenshots. This explanation is also accompanied by Linux instructions, although those use Ubuntu terminal commands.
Subject: General Tech, Processors, Displays, Shows and Expos | August 16, 2016 - 01:50 PM | Ryan Shrout
Tagged: VR, virtual reality, project alloy, Intel, augmented reality, AR
At the opening keynote of this summer’s Intel Developer Forum, CEO Brian Krzanich announced Project Alloy, a new initiative to enable a completely untethered VR platform. Using Intel processors and sensors, the goal of Project Alloy is to move all of the necessary compute into the headset itself, including enough battery to power the device for a typical session, removing the need for a high-powered PC and delivering a truly cordless experience.
This is indeed the obvious end-game for VR and AR, though Intel isn’t the first to demonstrate a working prototype. AMD showed the Sulon Q, an AMD FX-based system that was a wireless VR headset. It had real specs too, including a 2560x1440 90Hz OLED display, 8GB of DDR3 memory, and an AMD FX-8800P APU with embedded R7 graphics. Intel’s Project Alloy is currently using unknown hardware and won’t have a true prototype release until the second half of 2017.
There is one key advantage that Intel has implemented with Alloy: RealSense cameras. The idea is simple but the implications are powerful. Intel demonstrated using your hands and even other real-world items to interact with the virtual world. RealSense cameras use depth sensing to track hands and fingers very accurately, and with a device integrated into the headset and pointed out and down, Project Alloy prototypes will be able to “see” and track your hands, integrating them into the game and VR world in real-time.
The demo that Intel put on during the keynote definitely showed the promise, but the implementation was clunky and less than what I expected from the company. Real hands simply showed up in the game, rather than being represented by rendered hands that track accurately, and it definitely put a schism in the experience. Obviously it’s up to the application developer to determine how your hands are actually represented, but it would have been better to showcase that capability in the live demo.
Better than just tracking your hands, Project Alloy was able to track a dollar bill (why not a Benjamin Intel??!?) and use it to interact with a spinning lathe in the VR world. It interacted very accurately and with minimal latency – the potential for this kind of AR integration is expansive.
Those same RealSense cameras and their data are used to map the space around you, preventing you from running into things or people or cats in the room. This enables the first “multi-room” tracking capability, giving VR/AR users a new range of flexibility and usability.
Though I did not get hands-on with the Alloy prototype itself, the unit on stage looked pretty heavy and pretty bulky. Comfort will obviously be important for any kind of head-mounted display, and Intel has plenty of time to iterate on the design over the next year to get it right. Both AMD and NVIDIA have been talking up the importance of GPU compute to provide high quality VR experiences, so Intel has an uphill battle to prove that its solution, without the need for external power or additional processing, can truly provide the untethered experience we all desire.
Subject: General Tech | August 16, 2016 - 01:38 PM | Jeremy Hellstrom
Tagged: RRAM, flexible silicon
Flexible computers are quickly becoming more of a reality as researchers continue to find ways to make generally brittle components, such as processors and memory, out of new materials. This latest research has discovered new materials for constructing RRAM which allow working memory to remain viable even when flexed. Instead of traditional CMOS, they have found certain tungsten oxides which display all of the properties required for flexible memory. The use of those oxides is not new; however, they came with a significant drawback: fabricating the material required far more heat than a CMOS process. Nanotechweb reports on new developments from a team led by James Tour of Rice University which have led to a fabrication process that can take place at room temperature. Check out their article for an overview and a link to their paper.
"Researchers in the US and Korea say they have developed a new way to make a flexible, resistive random access memory (RAM) device in a room-temperature process – something that has proved difficult to do until now."
Here is some more Tech News from around the web:
- Microsoft is going to roll up all your Windows 7 and 8.1 updates Windows 10-style @ The Inquirer
- TSMC to make chips for iPad coming out in 2017, says report @ DigiTimes
- Minecraft meets Oculus Rift VR on Windows 10 @ The Inquirer
- White hat pops Windows User Account Control with log viewer data @ The Register
- Microsoft: Why we had to tie Azure Stack to boxen we picked for you @ The Register
- Windows 10 Anniversary Update – Heaven for Power Users @ Hardware Secrets
- NVIDIA Pascal Mobile GPU Specifications & Details! @ TechARP
Subject: General Tech | August 16, 2016 - 03:00 AM | Ryan Shrout
Tagged: pro, mouse, logitech g, logitech, gaming
Readers of PC Perspective have noticed that in the last couple of years a very familiar name has been asserting itself again in the world of gaming peripherals. Logitech, once the leader and creator of the gaming-specific market with devices like the G15 keyboard, found itself in a rut and was being closed in on by competitors such as Razer, Corsair and SteelSeries. The Logitech G brand was born and a renewed focus on this growing and enthusiastic market took place. We have reviewed several of the company’s new products including the G933/633 gaming headsets, G402 mouse that included an accelerometer and the G29 racing wheel.
Today Logitech is announcing the Logitech G Pro Gaming Mouse. As the name would imply, this mouse is targeted at gamers who fancy themselves professionals, or aspire to be. Even so, I imagine that many “normie” PC gamers will find the design, features and pricing attractive enough to put next to the keyboard on their desk. This is a wired-only mouse.
The design of the Pro Gaming Mouse is very similar to that of the Logitech G100s, a long-running and very popular mouse in the professional community. It falls a bit on the small side, but Logitech claims that the “small and nimble profile allows gamers of many different game types to play as precisely as possible.” It’s incredibly light as well, weighing in at just 83g!
This mouse has 6 programmable buttons, far fewer than some of the more extreme “gaming” mice on the market, all of which can be controlled through the Logitech Gaming Software platform. The on-board memory on the Pro allows gamers to configure the mouse on their own system and take those settings with them to competitions or friends’ PCs without needing to re-install software.
RGB lighting is of course included with the Pro mouse, and I like how it wraps around the sides and back of the mouse to add some flair to the design.
Logitech is using the PMW3366 sensor in the Pro Gaming Mouse, the same used in the G502, G900 and others. Though the importance of the sensor in a gaming mouse might be overlooked, the PMW3366 optical sensor is known to deliver accurate translation from 200 to 12,000 DPI with no acceleration or smoothing integrated that might hinder the input from the gamer.
The buttons on the Logitech G Pro use a torsion spring system rated at 20 million clicks (!!) which works out to 25 kilometers of button travel for the life of the mouse. The spring system used is designed to minimize effort and distance required for button actuation.
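Logitech's two lifetime figures imply a per-click travel distance, which is easy to sanity-check. A minimal sketch (the per-click travel value is derived here, not quoted by Logitech):

```python
# Logitech's quoted figures for the G Pro's torsion-spring buttons.
RATED_CLICKS = 20_000_000   # rated click lifetime
TOTAL_TRAVEL_KM = 25        # claimed total button travel over that lifetime

# Convert 25 km to millimetres and divide by the click count to get
# the implied switch travel per click.
travel_per_click_mm = TOTAL_TRAVEL_KM * 1_000_000 / RATED_CLICKS
print(travel_per_click_mm)  # 1.25 mm per click
```

That implied 1.25 mm of travel per actuation is in the typical range for mechanical mouse switches, so the two marketing numbers are at least internally consistent.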
All aspects of the mouse were built with gamers in mind, with Logitech’s in-house professional gamers at the design table weighing in on everything from the plastic feel to the size and weight. The scroll wheel is optimized for gaming rather than productivity, while the braided cable prevents snags. And the best part? The Logitech G Pro Gaming Mouse is set to have an MSRP of just $69.
The full press release is after the break and we are due to have a Logitech G Pro Gaming Mouse in our hands later today. We will follow up with thoughts and impressions soon!
Subject: General Tech | August 15, 2016 - 12:22 PM | Jeremy Hellstrom
Tagged: google, wireless isp, LTE
The FCC bidding was not terribly exciting, but the result was numerous companies buying up parts of the spectrum and, more importantly for this post, the opening of the 3550-3650 MHz band for anyone to use. The 3.5GHz band is already allocated to shipborne navigation and military radar systems, so this will be a test of the ability of computer systems to moderate interference, instead of the blanket bans regulators have always relied on in the past.
Google is about to test that ability; they will be running a test in several US cities to check the propagation of the signal as well as any possible maritime or military interference from the broadcast. This could be a way to get high-speed internet to the curb without requiring fibre optic runs, and it would also be compatible with LTE, if Google wanted to dip their toes into that market. You can read about the tests and where they will be happening over at Hack a Day.
"In a recently released FCC filing, Google has announced their experimental protocol for testing the new CBRS. This isn’t fast Internet to a lamp pole on the corner of the street yet, but it lays the groundwork for how the CBRS will function, and how well it will perform."
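As a rough illustration of why real-world propagation tests matter at 3.5 GHz, here is a back-of-the-envelope free-space path loss estimate (the standard ITU formula; the distances are hypothetical, and actual urban propagation with obstructions will be considerably worse):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (ITU form: distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Loss at 1 km for the bottom of the newly opened 3550-3650 MHz band.
print(round(fspl_db(1.0, 3550), 1))  # ~103.4 dB

# Doubling the distance adds ~6 dB, one reason cell-like deployments
# would need a dense grid of short-range transmitters.
print(round(fspl_db(2.0, 3550) - fspl_db(1.0, 3550), 1))  # ~6.0 dB
```

Higher frequencies lose more energy over the same distance than the sub-1GHz bands traditionally used for wide-area coverage, which is exactly the trade-off Google's city tests will be measuring.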
Here is some more Tech News from around the web:
- 7 reasons Windows XP refuses to die @ The Inquirer
- Native Skype for Windows Phone walked behind shed, shot heard @ The Register
- A Trove Of 3D Printer Filament Test Data @ Hack a Day
- Firefox 49 For Linux Will Ship With Plug-in Free Netflix, Amazon Prime Video Support @ Slashdot
- Adobe stops software licence audits in Americas, Europe @ The Register
Even before the term "Internet of Things" had been coined, Steve Gibson proposed home networking topology changes designed to deal with this looming new security threat. Unfortunately, little or no thought is given to the security aspects of the devices in this rapidly growing market.
One of Steve's proposed network topology adjustments involved daisy-chaining two routers together. The WAN port of an IOT-purposed router would be attached to the LAN port of the Border/root router.
In this arrangement, only IOT/smart devices are connected to the internal (or IOT-purposed) router. The idea was to isolate insecure or poorly implemented devices from the more valuable personal local data devices, such as a NAS holding important files and/or backups. Unfortunately, this clever arrangement leaves any device directly connected to the “border” router open to attack by infected devices running on the internal/IOT router. Said devices could perform a simple traceroute and identify that an intermediate network exists between them and the public Internet. Any device running under the border router with known (or worse - unknown!) vulnerabilities can then be immediately exploited.
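The discovery step described above is trivial for malware to perform: count how many private-address hops a traceroute passes through before reaching public address space. A minimal sketch, using a hypothetical hop list (the specific addresses are illustrative only):

```python
import ipaddress

def private_hop_count(hops):
    """Count leading traceroute hops in RFC 1918 private ranges.

    More than one private hop before the first public address suggests
    the probing device sits behind chained routers (double NAT), i.e.
    there is an intermediate network worth attacking.
    """
    count = 0
    for hop in hops:
        if ipaddress.ip_address(hop).is_private:
            count += 1
        else:
            break  # first public hop: we've left the home network
    return count

# Hypothetical trace from a device on the inner IOT router:
# inner router gateway, border router gateway, then a public upstream hop.
hops = ["192.168.2.1", "192.168.1.1", "8.8.8.8"]
print(private_hop_count(hops))  # 2 -> an intermediate network exists
```

A count of 2 or more tells an infected device that the border router's LAN, and everything on it, is one hop away.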
Gibson's alternative formula reversed the positioning of the IOT and border router. Unfortunately, this solution also came with a nasty side-effect. The border router (now used as the "secure" or internal router) became subject to all manner of man-in-the-middle attacks. Since the local Ethernet network basically trusts all traffic within its domain, an infected device on the IOT router (now between the internal router and the public Internet) can manipulate or eavesdrop on any traffic emerging from the internal router. The potential consequences of this flaw are obvious.
The third time really is the charm for Steve! On February 2nd of this year (Episode #545 of Security Now!) Gibson presented us with his third (and hopefully final) foray into the magical land of theory-crafting as it relates to securing our home networks against the Internet of Things.