Subject: General Tech | October 6, 2016 - 11:37 PM | Tim Verry
Tagged: supercomputer, microsoft, deep neural network, azure, artificial intelligence, ai
Microsoft recently announced it would be reorganizing some 5,000 employees as it focuses its efforts on artificial intelligence with a new AI and Research Group. The Redmond giant is pulling computer scientists and engineers from Microsoft Research, the Information Platform, Bing, and Cortana groups, and the Ambient Computing and Robotics teams. Led by 20-year Microsoft veteran Harry Shum (who has worked in both research and engineering roles at the company), the new AI team promises to "democratize AI" and be a leader in the field with intelligent products and services.
It seems that "democratizing AI" is less about free artificial intelligence and more about making the technology accessible to everyone. The AI and Research Group plans to develop artificial intelligence to the point where it changes how humans interact with their computers (read: Cortana 2.0). Services and commands will be conversational rather than rigid; new applications will come baked with AI, such as office suites that can proofread documents and photo editors that can suggest optimal edits; and new vision, speech, and machine analytics APIs will let other developers harness the technology in their own applications.
Further, Microsoft wants to build the world's fastest AI supercomputer using its Azure cloud computing service. The Azure-powered AI will be available to everyone for their applications and research needs (for a price, of course!). Microsoft certainly has the money, brain power, and computing power to throw at the problem, and this may be one of the major areas where looking to "the cloud" for a company's computing needs is a smart move, as the up-front capital needed for hardware, engineers, and support staff to do something like this in-house would be extremely prohibitive. It remains to be seen whether Microsoft will beat its competitors to being first, but it is certainly staking its claim and does not want to be left out completely.
“Microsoft has been working in artificial intelligence since the beginning of Microsoft Research, and yet we’ve only begun to scratch the surface of what’s possible,” said Shum, executive vice president of the Microsoft AI and Research Group. “Today’s move signifies Microsoft’s commitment to deploying intelligent technology and democratizing AI in a way that changes our lives and the world around us for the better. We will significantly expand our efforts to empower people and organizations to achieve more with our tools, our software and services, and our powerful, global-scale cloud computing capabilities.”
Interestingly, this announcement comes shortly after the news that industry giants Amazon, Facebook, Google-backed DeepMind, IBM, and Microsoft founded the not-for-profit Partnership on AI, an organization that will collaborate and research best practices for AI development and use (and hopefully how to teach them not to turn on us, heh).
I am looking forward to the future of AI and the technologies it will enable!
Subject: General Tech | October 4, 2016 - 02:29 PM | Jeremy Hellstrom
Tagged: microsoft, server 2016
Ars Technica have put together an overview of the new Windows Server; the three pages broadly cover the new features you will find. As has often been discussed, there will be three ways of installing the new Server OS: the familiar Desktop Experience, as well as Core and Nano. Nano is similar to the Core installation which we saw introduced in Server 2012, but it further reduces the interface and attack surface by removing the last remnants of the GUI, along with support for 32-bit apps and the Microsoft Installer; all you get is a basic control console. The Core and Desktop versions remain much the same as in the 2012 version.
If you are curious about the inclusion of Docker features such as the Linux-like containers and changes to Hyper-V or deployment techniques drop by for a read.
"Like a special breed of kaiju, Microsoft's server platform keeps on mutating, incorporating the DNA of its competitors in sometimes strange ways. All the while, Microsoft's offering has constantly grown in its scope, creating variants of itself in the process."
Here is some more Tech News from around the web:
- Memristor behaves like a synapse @ Nanotechweb
- Google Fiber Is Now a Fiber and Wireless ISP @ Slashdot
- A year living with the Nexus 5X – the good, the bad, and the Nougat @ The Register
- Researchers Develop System To Send Passwords, Keys Through Users' Bodies @ Slashdot
- Apple takes tips from Microsoft as macOS Sierra becomes an automatic download @ The Inquirer
- Microsoft Azure sets up shop in France @ The Register
Subject: General Tech | October 3, 2016 - 01:27 PM | Jeremy Hellstrom
Tagged: Windows 7, windows 10, microsoft, market share
A change of one percent may seem tiny at first glance but historically it is an incredibly large shift in market share for an operating system. Unfortunately for Microsoft it is Windows 7 which has gained share, up to 48.27% of the market, with Windows 10 dropping half a point to 22.53% while the various flavours of Windows 8 sit at 9.61%. This would make it almost impossible for Microsoft to reach their goal of one billion machines running Windows 10 within two years of release, and it spells bad news for their income from consumers.
Enterprises have barely touched the new OS for a wide variety of reasons, though companies still provide significant income thanks to corporate licenses for Microsoft products and older operating systems. It should be very interesting to see how Microsoft will react to this information, especially if the trend continues. The sales data matches many of the comments we have seen here; the changes which they made were not well received by their customer base and the justifications they've used in the design of the new OS are not holding water. It shouldn't be long before we hear more out of Redmond; in the meantime you can pop over to The Inquirer to see Net Applications' data if you so desire.
"The latest figures from Net Applications’ Netmarketshare service show Windows 7, now over seven years old, gain a full percentage point to bolster its place as the world’s most popular desktop operating system with 48.27 per cent (+1.02 on last month)."
Here is some more Tech News from around the web:
- HUDWAY Glass Head-Up Display Review @ NikKTech
- AMD prepares Zen for CES 2017 launch; aggressively clearing inventory for platform transition @ DigiTimes
- How to steal the mind of an AI: Machine-learning models vulnerable to reverse engineering @ The Register
- Linus Torvalds Officially Announces the Release of Linux Kernel 4.8 @ Slashdot
- Security analyst says Yahoo!, Dropbox, LinkedIn, Tumblr all popped by same gang @ The Register
- Source code for 'record-breaking' Mirai IoT botnet released online @ The Inquirer
- iPhone 7 Finishes Last In New Test of Battery Life @ Slashdot
Subject: General Tech | September 30, 2016 - 10:07 PM | Scott Michaud
Tagged: microsoft, windows 10
I've been seeing a lot of people discussing how frequently Windows 10 seems to be getting updated. This discussion usually circles back to how many issues have been reported with the latest Anniversary Update, and how Microsoft has been slow in rolling it out. The thing is, while the slow roll-out is interesting, the way Windows 10 1607 is being patched is not too unusual.
The odd part is how Microsoft has been releasing the feature updates, themselves.
In the past, Microsoft has tried to release updates on the second Tuesday of every month. This provides a predictable schedule for administrators to test patches before deploying them to an entire enterprise, in case the update breaks something that is mission-critical. With Windows 10, Microsoft has declared that patches will be cumulative and can occur at any time. This led to discussion about whether or not “Patch Tuesday” is dead. Now, a little over a year has gone by, and we can actually quantify how the OS gets updated.
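For the curious, "Patch Tuesday" is simply the second Tuesday of each month, which is easy to compute for yourself. Here is a quick Python sketch (the function name is mine, not anything Microsoft ships):

```python
import calendar
from datetime import date

def patch_tuesday(year: int, month: int) -> date:
    """Return the second Tuesday (Patch Tuesday) of the given month."""
    # itermonthdates also yields padding days from adjacent months,
    # so filter to the requested month before picking Tuesdays.
    tuesdays = [d for d in calendar.Calendar().itermonthdates(year, month)
                if d.weekday() == calendar.TUESDAY and d.month == month]
    return tuesdays[1]  # index 1 = the second Tuesday

print(patch_tuesday(2015, 9))   # 2015-09-08
print(patch_tuesday(2016, 10))  # 2016-10-11
```

That predictability is exactly what administrators relied on, and what cumulative, any-time updates take away.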
There seems to be a pattern that starts with each major version release, which has (thus far) been builds 10240, 10586, and 14393. Immediately before and after these builds start to roll out to the public, Microsoft releases a flurry of updates to fix issues.
For instance, Windows 10 version 1507 had seven sub-versions of 10240 prior to general release, and five hotfixes pushed down Windows Update within the first month of release. The following month, September 2015, had an update on Patch Tuesday, as well as an extra one on September 30th. The month after that also had two updates, the first of which landed on October's Patch Tuesday. The OS was then patched once on each following Patch Tuesday.
The same trend occurred with Build 10586 (Windows 10 version 1511). Microsoft released the update to the public on November 12th, but pushed a patch through Windows Update on November 10th, and five more over Windows Update in the following month-and-a-bit. It mostly settled down to Patch Tuesday after that, although a few months had a second hotfix sometime in the middle.
We are now seeing the same trend happen with Windows 10 version 1607. Immediately after release, Microsoft pushed a bunch of hotfixes. If history repeats itself, we should start to see about two updates per month for the next couple of months, then we will slow down to Patch Tuesday until Redstone 2 arrives sometime in 2017.
So, while this seems to fit a recurring trend, I do wonder why this trend exists.
Part of it makes sense. When Microsoft is developing Windows 10, it is trying to merge additions from a variety of teams into a single branch, and do so once or twice each year. This likely means that Microsoft has a “last call” date for these teams to merge their additions into the public branch, and then QA needs to polish this up for the general public. While they can attempt to have these groups check in mid-way, pushing their work out to Windows Insiders in a pre-release build, you can't really know how the final build will behave until after the cut-off.
At the same time, the massive flood of patches within the first month would suggest that Microsoft is pushing the final build to the public about a month or two too early. If this trend continues, it would make the people who update within the first month basically another ring of the Insider program. The difference is that it is less opt-in, because you get it when Windows Update tells you to.
It will be interesting to see how this continues going forward, too. Microsoft has already delayed Redstone 2 until 2017, as I mentioned earlier. This could be a sign that Microsoft is learning from past releases, and optimizing their release schedule based on these lessons. I wonder how soon before release Microsoft will settle on a “final build” next time. It seems like Microsoft could avoid many stability problems by simply setting an earlier merge date, and aggressively performing QA for a longer period until it is released to the public.
Or I could be completely off. What do you all think?
Subject: General Tech | September 27, 2016 - 02:41 AM | Scott Michaud
Tagged: windows 10, virtualization, microsoft
Microsoft is currently hosting their Ignite conference, which is somewhat the successor of TechEd. Monday kicked off with a couple of keynotes, including one from Satya Nadella himself, but this post will focus on a specific announcement: Windows Defender Application Guard.
With a typical web browser, a malicious website can infect the user's PC by exploiting an unpatched vulnerability before the user updates their browser. The next feature release of Windows 10 is expected to include virtualization technology, again called Windows Defender Application Guard, which runs websites in a lightweight virtual machine if they are opened in Edge and are not part of a whitelist. This means that an attacker who wants to infect the user's device not only needs to know of a vulnerability in Edge; they also need to know of a vulnerability in the virtual machine, and they must be able to use the Edge vulnerability to exploit it. Especially for enterprise environments, where ransomware that encrypts any data it finds can be devastating, this should add a huge wall protecting a large, complex application platform (the web browser) from untrusted third parties (websites).
Of course, this concept isn't new. Not only are virtual PCs common in the enterprise for security and control reasons, but applications like Sandboxie have more directly implemented similar ideas. Still, having it be a built-in feature of the operating system should mean that it gets even more support with regards to performance and stability, versus tacking on a third-party solution through public APIs.
Speaking of public APIs -- Microsoft won't be providing one at first. It will only be used for Edge for the time being. Also, it's only available for Windows 10 Enterprise, so I hope you didn't get your hopes up.
Wow, that turned dark real quick.
Subject: General Tech | September 22, 2016 - 12:39 PM | Jeremy Hellstrom
Tagged: Intel, Lenovo, linux, signature edition, microsoft
Yesterday we saw the first stories appear about how the malware-free Lenovo Signature Editions of mobile devices such as the Yoga 900S and Yoga 710S blocked the installation of Linux, and effigies of Microsoft and Lenovo were set afire. As is common on the interwebs, the true villain was not implicated until the excitable crowd had run off with their pitchforks and torches, letting the rest of us research the issue and track it back to Intel.
The issue is that the Intel soft RAID present on these machines is not really compatible with Linux, quite a common issue unfortunately. Lenovo is not innocent in this, however, as they have greatly exacerbated the issue by making it difficult to change your SATA mode from RAID to AHCI in the BIOS from Windows and impossible from a live boot of Linux. In order to change your SATA settings, Lenovo has decided to let you relive the days of Windows XP, when you had to bash on F6 during the initial installation of Windows to let it know you had a special disk with drivers on it to enable AHCI or RAID mode. Even better, apparently you have to get in touch with Lenovo to get these drivers, and they only work in Windows, of course.
So thanks to the lousy Linux support offered by Intel's soft RAID implementation, you cannot install Linux on Signature Editions of some Yoga machines, and if you need to set your SATA to AHCI, say because of Endpoint Encryption, you must go through a process that went out with that OS Microsoft wants people to stop using. If you want to trace back the Reddit thread and the research that was done to determine the culprit, The Register has compiled a good reference.
"A Reddit thread this morning accuses Microsoft and Lenovo of conspiring to prevent the installation of non-Windows operating systems on the Chinese goliath's PCs at the firmware level. Linux fans vented on the message board about the difficulties of installing open-source distributions on certain Lenovo machines."
Here is some more Tech News from around the web:
- Magneto-resistant upstart Everspin gets itself into an IPO whizz @ The Register
- BT's Wi-Fi Extender works great – at extending your password to hackers @ The Register
- Microsoft unveils Nokia 216 feature phone @ DigiTimes
- TV industry gets its own 'dieselgate' over 'leccy consumption tests @ The Register
Subject: Graphics Cards | September 20, 2016 - 03:58 PM | Scott Michaud
Tagged: microsoft, xbox, xbox one, pc gaming, nvidia, GTX 1080, gtx 1070
NVIDIA has just announced that specially marked, 10-series GPUs will be eligible for a Gears of War 4 download code. This bundle applies to GeForce GTX 1080 and GeForce GTX 1070 desktop GPUs, as well as laptops which integrate either of those two GPUs. As always, if you plan on purchasing a GPU due to this bundle, make sure that the product page for your retailer mentions the bundle.
Also, through the Xbox Play Anywhere initiative, NVIDIA claims that this code can be used to play the game on Xbox One as well. Xbox Play Anywhere allows users to purchase a game on either of Microsoft's software stores, Xbox Store or Windows Store, and it will automatically count as a purchase for the cross-platform equivalent. It also has implications for cloud saves, but that's a story for another day.
The bundle begins today, September 20th. Gears of War 4 launches on October 11th.
Subject: General Tech | September 15, 2016 - 06:21 PM | Scott Michaud
Tagged: microsoft, windows 10, Windows Store
If you have developed a Win32 or .NET application, and are interested in publishing it to the Windows Store, then Microsoft has released a tool to translate from the one to the other. There are some obvious concerns about this, which I will discuss later in this post, but most of those are more relevant to society as a whole than to a single person who writes an app. Formerly called Project Centennial, the tool is designed to help developers enter the UWP platform with little effort, using the APIs they are comfortable with.
The major concern (from a societal standpoint) is that the whole reason why Microsoft doesn't deprecate Win32 is because there's too much of it in use. This conversion process forces the application to be installed either through sideloading or by uploading it to the Windows Store. This is much better than iOS and the now-deprecated Windows RT, which don't allow sideloading content, but there's nothing preventing Microsoft from just killing sideloading in five, twenty, or a hundred years. Since sideloading is the only way to express yourself through a native application without Microsoft's blessing, you can see what could go wrong if a government tells them that encryption software needs to go away, or a civil rights group attempts to release a controversial work of art.
Again, as I said earlier, this is a society issue, though. For interested developers, the tool is a way to bring your old software to a new distribution method. People like Tim Sweeney will probably say “no thanks” for political reasons, but, if that's not a concern for you, the tool exists.
DesktopAppConverter is free on the Windows Store.
Subject: General Tech | September 7, 2016 - 09:18 PM | Scott Michaud
Tagged: sony, ps4, ps4 pro, microsoft, Project Scorpio, xbox
At today's media briefing event, Sony announced two new versions of their PlayStation 4 console. The first is not even given a new name; they are just referring to it as the “new slimmer and lighter PS4” in their marketing material. It replaces the current version with one that is about 30% smaller, 16% lighter, and 28% more power efficient, according to a press release provided by AMD.
This update will be sold for $299.99 USD ($379.99 CDN) starting on September 15th.
The main topic of discussion was the PlayStation 4 Pro, though. Like Microsoft is doing with Project Scorpio, Sony wants the PS4 Pro to be compatible with the same catalog of titles, but run them at higher resolutions and color depths. Sony claims that this generation is basically maxing out what can be done with 1080p. PC developers do not seem to have a problem using extra performance for new features, but the point that development costs are quickly becoming the limiting factor is valid to some extent.
In terms of specifications, while the CPU got an unspecified speed bump, the main upgrade is a new GPU, which is rated at 4.2 TFLOPs. This is about 30% slower than Microsoft's announced Project Scorpio (6 TFLOPs) but it also will arrive a year sooner. Will this lead time matter, though? The software catalog is already being built up by both companies, and it has been since each console launched in 2013.
Did they ever explain the extra ring on the case?
Also, because Microsoft started with a weaker console, scaling up to 4K resolution should be easier for their game developers. Project Scorpio is about 4.6x faster than the Xbox One, and it intends to draw four times the number of pixels. The gap between the PS4 and the PS4 Pro is just 2.3x. That could be a problem for them. (Meanwhile, us PC gamers can strap multiple 10+ TFLOP GPUs together for true 4K at decent frame rates, but that's another discussion.)
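For what it's worth, the scaling claims above are simple arithmetic, easy to check yourself. A quick sketch (the base-console GPU ratings of roughly 1.31 and 1.84 TFLOPs are widely reported launch specs, not figures from either company's announcements here):

```python
# GPU ratings in TFLOPs. Base-console figures are assumed from
# commonly reported launch specs, not from this post.
xbox_one, scorpio = 1.31, 6.0
ps4, ps4_pro = 1.84, 4.2

print(round(scorpio / xbox_one, 1))    # 4.6 -> "about 4.6x faster"
print(round(ps4_pro / ps4, 1))         # 2.3 -> "just 2.3x"

# 4K UHD is exactly four times the pixels of 1080p.
print((3840 * 2160) // (1920 * 1080))  # 4
```

So Scorpio's compute headroom roughly matches its pixel-count target, while the PS4 Pro's does not; hence the checkerboard-style upscaling concern.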
Granted, theoretical performance is different from real-world performance. We'll need to re-evaluate the industry in a couple of years, once an appropriate amount of hindsight is available. Also, Sony claims that PlayStation VR will still be available for both consoles, and that it will be a good experience whichever you choose. This is clearly aimed at Microsoft requiring Project Scorpio for their upcoming VR initiative, although it is likely meant to prevent confusion in Sony's own fan base rather than to prod their competitor.
Again, the PlayStation 4 Pro is launching this year, on November 10th, and is expected to retail for $399.99 USD ($499.99 CDN). It's not a big jump in performance, but it's not a big jump in price, either. In fact, I would consider it priced low enough to question the value of the regular PS4, even at $299.
What are your thoughts? Is this actually priced too low for pro?
Subject: General Tech | August 23, 2016 - 12:40 PM | Jeremy Hellstrom
Tagged: hololens, microsoft, Tensilica, Cherry Trail, hot chips
Microsoft revealed information about the internals of the new holographic processor used in its HoloLens at Hot Chips, the first peek we have had at it. The new headset is another win for Tensilica, as they provide the DSP and instruction extensions; previously we have seen them work with VIA to develop an SSD controller and with AMD on TrueAudio solutions. Each of the 24 cores is hardwired for a different task, offering more efficient processing than software running on flexible hardware.
The processing power for your interface comes from a 14nm Cherry Trail processor with 1GB of DDR3, and yes, your apps will run on Windows 10. For now the details are still sparse; there is still a lot to be revealed about Microsoft's answer to VR. Drop by The Register for more slides and info.
"The secretive HPU is a custom-designed TSMC-fabricated 28nm coprocessor that has 24 Tensilica DSP cores. It has about 65 million logic gates, 8MB of SRAM, and a layer of 1GB of low-power DDR3 RAM on top, all in a 12mm-by-12mm BGA package. We understand it can perform a trillion calculations a second."
Here is some more Tech News from around the web:
- Fujitsu: Why we chose 64-bit ARM over SPARC for our exascale super @ The Register
- Deus Ex: Mankind Divided Now Bundled with Select AMD CPUs @ Guru of 3D
- Google begins posting Nexus images for the Android 7.0 Nougat update @ Ars Technica
- Your wget is broken and should DIE, dev tells Microsoft @ The Register
- Epic Games forum hack exposes 800,000 credentials @ The Inquirer
- Open Source Hardware Comes of Age @ Hardware Secrets
- Total War : Warhammer Giveaway Contest @ TechARP