Subject: General Tech | November 27, 2014 - 10:20 PM | Scott Michaud
Tagged: windows 10, windows, mkv, microsoft, hevc, h.265, flac
Native support for audio and video codecs is helpful for a platform. Software will be able to call upon the operating system's built-in functions, rather than integrating a solution. Of course, some will continue to roll their own, and that's fine, but it is obviously helpful for the foundation to have its own solution (especially in cases where licenses and royalties are required).
Windows 10 is expected to increase its platform support to include FLAC, MKV, and HEVC (H.265), and more may be coming. A tweet from Gabriel Aul suggests that this will be available starting in the next preview build, which will land in early 2015. Hopefully these additions include both encoding and decoding support, possibly allowing audio and video editors to take advantage of these formats.
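For developers, native support should surface through the usual Media Foundation plumbing. Below is a minimal sketch, assuming the eventual decoder registers as a standard Media Foundation transform under the MFVideoFormat_HEVC subtype (an assumption on my part, since nothing has shipped yet), of how an application could probe for it:

```cpp
// Hedged sketch: probe Media Foundation for a registered HEVC video decoder.
// Assumes the eventual Windows 10 decoder registers as an MFT with the
// MFVideoFormat_HEVC input subtype -- not confirmed by Microsoft yet.
#include <windows.h>
#include <mfapi.h>
#include <cstdio>

#pragma comment(lib, "mfplat.lib")
#pragma comment(lib, "mfuuid.lib")

int main()
{
    if (FAILED(MFStartup(MF_VERSION)))
        return 1;

    // Ask for any registered video decoder (software or hardware) that
    // accepts HEVC as its input type.
    MFT_REGISTER_TYPE_INFO input = { MFMediaType_Video, MFVideoFormat_HEVC };
    IMFActivate** activates = nullptr;
    UINT32 count = 0;
    HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_DECODER,
                           MFT_ENUM_FLAG_SYNCMFT | MFT_ENUM_FLAG_ASYNCMFT |
                           MFT_ENUM_FLAG_HARDWARE,
                           &input, nullptr, &activates, &count);
    if (SUCCEEDED(hr)) {
        printf("HEVC decoders registered: %u\n", count);
        for (UINT32 i = 0; i < count; ++i)
            activates[i]->Release();
        CoTaskMemFree(activates);
    }

    MFShutdown();
    return 0;
}
```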
The next build of Windows 10 Technical Preview is expected for early next year. The full OS is said to launch late that year.
Subject: General Tech | November 27, 2014 - 09:29 PM | Scott Michaud
Tagged: apple, safari, google, yahoo, bing, microsoft, mozilla
After Mozilla inked its deal with Yahoo, eyes turned to Apple and its Safari browser. Currently, the default search engine is Google on both iOS and OS X, although Bing is the primary engine used for other functions, like Siri and Spotlight. Apple is tied into a contract with Google for Safari on those two platforms until early 2015, but who will get the new contract?
Apparently Yahoo and Microsoft have both approached the company for the position, and Apple is not ruling any of the three out. Probably the most interesting part is how seriously Yahoo is taking the search business. The deal with Mozilla is fairly long-term, and with Yahoo approaching Apple as well, the Firefox deal probably was not just charity toward Mozilla because no one else wanted to be the browser's default. Yahoo would probably need significant monetary backing to land an Apple deal, which suggests the same is true of its deal with Mozilla.
If both Mozilla and Apple leave Google, it will take a significant chunk out of the search engine's traffic. Power users, like those who read this site, will likely be unaffected if they care, because the barrier to changing the default search engine is so low. On the other hand, even the most experienced user will often accept default settings until there is a reason to change. The winning party will need a product good enough to overcome that initial friction.
But the money will at least give them a chance when the decision comes into effect. That is, unless the barrier to changing default search engines is less than the barrier to changing default web browsers.
Google will always be the default on Google Chrome.
Subject: General Tech | November 27, 2014 - 04:17 AM | Scott Michaud
Tagged: microsoft, windows, windows 10, windows 6.4
Windows Vista broke away from the NT 5.x version number that was shared between Windows 2000 and Windows XP. Since then, each major OS release from Microsoft has incremented the minor version by one: Windows 7 was 6.1, Windows 8 was 6.2, and Windows 8.1 was 6.3. The current Windows 10 previews register as Windows 6.4, but screenshots suggest that Microsoft is considering a bump to 10.0 before release.
Seriously, this time?
This leads to two discussions: “compatibility” and “why”.
First, because some applications query the Windows version number and adjust their behavior, there is some concern that 10.0 could lead to problems. For instance, if an installer checks that Windows' major version is exactly 6, rather than at least 6, it could simply refuse to load (at least without compatibility mode). In fact, I remember Microsoft speaking about this issue back when Vista launched, saying that spoofing incorrect versions fixed (I believe) most problems. Peter Bright at Ars Technica notes that changes to application architecture, instituted with Windows 7, 8, and 8.1, make this change safer than when Vista bumped the number to 6.x. Applications will be given an earlier version number unless they claim support for newer releases in their manifests.
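To make the failure mode concrete, here is a minimal sketch (my own illustration, not code from any real installer) of the equality-test anti-pattern. Note that GetVersionEx has been deprecated since Windows 8.1 and, per the manifest behavior above, will report an older version to processes that do not declare support for newer releases:

```cpp
// Sketch of the fragile version check described above.
#include <windows.h>
#include <cstdio>

#pragma warning(disable : 4996) // GetVersionExW is deprecated as of Win 8.1

int main()
{
    OSVERSIONINFOW osvi = { sizeof(osvi) };
    GetVersionExW(&osvi);

    // Buggy: refuses anything that is not exactly major version 6, so a
    // hypothetical 10.0 fails even though every needed API is present.
    if (osvi.dwMajorVersion != 6) {
        printf("Unsupported Windows %lu.%lu, aborting install.\n",
               osvi.dwMajorVersion, osvi.dwMinorVersion);
        return 1;
    }
    // The robust form is a floor, not an equality:
    //     if (osvi.dwMajorVersion < 6) { /* abort */ }
    printf("Windows %lu.%lu detected, proceeding.\n",
           osvi.dwMajorVersion, osvi.dwMinorVersion);
    return 0;
}
```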
And then we get to the “why”. There really isn't any reason to keep the version number in lockstep with the branding, so this could be a sign that Microsoft is pushing hard on branding with this release, which makes sense. Windows 10, from a technical standpoint, is shaping up nicely (although I am still concerned about WinRT-based app sideloading). It would not surprise me if they were willing to fuss over a detail this small to further cement a good brand image.
It could be a good... start.
So this is what happens when you install pre-release software on a production machine.
Sure, I only trusted it as far as a second SSD alongside my Windows 7 install, but it would be fair to say that I immersed myself in the experience. It was also not the first time that I evaluated an upcoming Microsoft OS on my main machine, having done the same for Windows Vista and Windows 7 while both were in development. Windows 8 was the odd one out; it went on my laptop instead. In this case, I was in the market for a new SSD anyway and was thus willing to give Windows 10 a chance, versus installing Windows 7 again.
So far, my experience has been roughly positive. The first two builds were glitchy: in the first three days, I rebooted my computer more times than I had all year (which is about one or two times per month). It could be the Windows Key + Arrow Key combinations dropping randomly, Razer Synapse going on strike a couple of times until I reinstalled it, the four-or-so reboots required to install a new build, and so forth. You then also have the occasional issue of a Windows service (or DWM.exe) deciding to max out a core or two.
But it is pre-release software! That is all stuff to ignore. The only reason I am even mentioning it is so people do not follow in my footsteps and install it on their production machines, unless they are willing to have pockets of downtime here or there. Even then, the latest build, 9879, has been fairly stable. It has been installed all day and has not given me a single issue. This is good, because it is the last build we will get until 2015.
What we will not ignore are the features. For the first two builds, Windows 10 was annoying to use with multiple monitors. Supposedly to make it easier to align items, the mouse cursor would remain locked inside each monitor's boundary until you provided enough velocity for it to escape to the next one. This was the case with Windows 8.1 as well, but there you were given registry entries to disable the feature; those keys did not work with Windows 10. With Build 9879, however, the behavior seems to be disabled unless you are currently dragging a window. In that case, a quick movement pulls the window between monitors, while a slow movement performs a Snap.
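For reference, the Windows 8.1-era workaround was a pair of per-user values under Control Panel\Desktop, reportedly MouseMonitorEscapeSpeed and MouseCornerClipLength. The names and REG_DWORD type here are as circulated in community reports at the time, so treat them as assumptions; and, as noted above, they no longer work on the Windows 10 previews. A sketch of setting them:

```cpp
// Hedged sketch: write the Windows 8.1-era values said to disable the
// sticky cursor at monitor edges. Value names and type are assumptions
// based on community reports; a sign-out may be needed to take effect.
#include <windows.h>

static bool SetDesktopDword(const wchar_t* name, DWORD value)
{
    HKEY key = nullptr;
    // Per-user desktop settings live under HKCU\Control Panel\Desktop.
    if (RegOpenKeyExW(HKEY_CURRENT_USER, L"Control Panel\\Desktop",
                      0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
        return false;
    const LONG rc = RegSetValueExW(key, name, 0, REG_DWORD,
                                   reinterpret_cast<const BYTE*>(&value),
                                   sizeof(value));
    RegCloseKey(key);
    return rc == ERROR_SUCCESS;
}

int main()
{
    // Zero reportedly disables the edge resistance entirely.
    SetDesktopDword(L"MouseMonitorEscapeSpeed", 0);
    SetDesktopDword(L"MouseCornerClipLength", 0);
    return 0;
}
```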
This is me getting ready to snap a window on the edge between two monitors with just my mouse.
In a single build, they turned this feature from something I wanted to disable, to something that actually performs better (in my opinion) than Windows 7. It feels great.
Now on to a not-so-pleasant experience: updating builds.
Simply put, you can click "Check Now" and "Download Update" all you want, but it will just sit there doing nothing until it feels like acting. During the update from 9860 to 9879, I waited with the PC Settings app open for three hours. At some point, I got suspicious and decided to monitor network traffic: nothing. So I did the close-app, open-app, re-check dance a few times, and eventually gave up. About half an hour after I closed PC Settings the last time, my network traffic spiked to the maximum my internet connection allows, which Task Manager attributed to a Windows service.
Shortly after, I was given the option to install the update. After finishing what I was doing, I clicked the install button and... it didn't seem to do anything. After about half an hour, it prompted me to restart my computer with a full-screen message that you cannot click past to save your open work - you either restart now or postpone for one or more hours, with no in-between. About another twenty minutes (and four or five reboots) after I chose to restart, I was back up and running.
Is that okay? Sure. An update clearly has real work to do, and that can take several minutes; it would be unrealistic to complain about a 20-minute install. The only real problem is that it waits for extended periods of time doing nothing (measured: literally nothing) until it decides that the time is right, and that time is NOW! It may be three hours after you originally cared, but the time is NOW!
Come on Microsoft, let us know what is going on behind the scenes, and give us reliable options to pause or suspend the process before the big commitment moments.
So that is where I am: one highly positive experience and one slightly annoying one. Despite my concerns about the Windows Store (which I have discussed at length in the past and which are still valid), this operating system seems to be on a great path. It is a work in progress. I will keep you up to date, as my machine is kept up to date.
Subject: General Tech | November 12, 2014 - 07:38 PM | Scott Michaud
Tagged: visual studio, microsoft
While this is significantly different from what we usually write about, I have a feeling that there is some overlap with our audience.
Update: If you use Visual Studio Express 2013, you may wish to uninstall it before installing Community. In my experience, the two seem to install to the same directory, so uninstalling Express after installing Community will break both. I am currently repairing Community, which should fix it, but there is no sense in you installing twice if you know better.
Visual Studio Express has been the free, cut-down option for small and independent software developers. It can be used for commercial applications, but it is severely limited in many areas, such as its lack of plug-in support. Today, Microsoft announced Visual Studio Community 2013, a free version of Visual Studio that is equivalent to Visual Studio Professional 2013 for certain users (explained below). According to TechCrunch, while Visual Studio Express will still be available for download, Community is expected to be the version going forward.
Image Credit: Wikimedia (modified)
There are four use cases for Visual Studio Community 2013 (a toy encoding of the rules follows the list):
- To contribute to open-source projects (unlimited users)
- To use in a classroom environment for learning (unlimited users)
- To use as a tool for Academic research (unlimited users)
- To create free or commercial, closed-source applications (up to 5 users), provided that:
  - You are an individual or a small studio with fewer than 250 PCs
  - You have no more than $1 million USD in annual revenue
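Here is a toy sketch of those thresholds as a single eligibility check, purely to illustrate how the rules compose (not legal advice; the figures are as reported at announcement time):

```cpp
// Toy encoding of the Visual Studio Community 2013 eligibility rules above.
#include <cstdio>

bool CommunityEligible(bool openSource, bool classroom, bool academicResearch,
                       int devsOnProject, int pcsInOrg, double annualRevenueUsd)
{
    // The first three use cases allow unlimited users.
    if (openSource || classroom || academicResearch)
        return true;
    // Otherwise: small teams only, with org-size and revenue caps.
    return devsOnProject <= 5 && pcsInOrg < 250 && annualRevenueUsd <= 1e6;
}

int main()
{
    // A six-developer closed-source studio falls outside the free tier.
    printf("%d\n", CommunityEligible(false, false, false, 6, 30, 5e5)); // 0
    return 0;
}
```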
Honestly, this is a give-and-take scenario, but it seems generally positive. I can see it being problematic for small studios with six or more developers, but they can (probably) still use Visual Studio Express 2013 Update 3 until it gets too old. For basically everyone else, this means that you do not need to worry about technical restrictions when developing software. It opens an avenue for companies like NVIDIA (Nsight Visual Studio Edition) and Epic Games (Unreal Engine 4) to deliver their plug-ins to the independent developer community. When I get a chance, and after it finishes installing, I will probably check whether those examples already work.
Visual Studio Community 2013 Update 4 is available now at Microsoft's website.
Subject: General Tech | November 2, 2014 - 10:13 PM | Scott Michaud
Tagged: microsoft, onedrive, skydrive, cloud storage, subscription service, subscription
I guess if you are going to take a hit from enthusiasts by offering a 1TB tier, then you might as well just go all the way. Microsoft has been rolling out an unlimited tier to their various subscription products, starting with Office 365 Home, Personal, and University. OneDrive for Business customers, who are currently limited to 1TB of total storage, will be granted the unlimited tier, starting with "First Release" customers in 2015. It will probably arrive for "Standard Release" customers a couple of weeks later.
The 1TB tier was not around for long. It launched across several subscriptions in late April, starting at $5 per user per month. Now, the cheapest option is $7 per user per month, but it comes with a license for Office 365 Personal. Note that the first three tiers, Home, Personal, and University, are each non-commercial licenses. The rapid increase in capacity could mean that the original initiative was very successful at wooing new customers, or the exact opposite. It is even possible that unlimited was the original intent and they arrived there by way of a 1TB plan, either to shake up competitors, to double up on media attention, or simply to dip a toe in. Basically, they could have done this for any reason under the sun; we have no idea.
Unlimited storage in OneDrive for Office 365 Personal, Home, and University is currently available, starting at $7 per user per month. OneDrive for Business customers will need to wait until 2015.
Subject: General Tech | October 29, 2014 - 12:22 PM | Jeremy Hellstrom
Tagged: arm, microsoft, windows server
The Register does not specify which version was used, likely a recent but highly modified build, but Microsoft has demonstrated its Server OS running on ARM hardware. This gives the company another inroad to low-cost server builds that don't necessarily have Intel or AMD inside, as well as a hedge against Linux. Linux already runs happily on just about any hardware you could want (or soon will), and Microsoft is likely worried about losing share to the open-source OS. It will be interesting to see what Microsoft can offer the price-conscious shopper to convince them to spend money on an OS license when Linux is free. The older generation of techs, who grew up with large UNIX servers and watched Microsoft displace them, have always been one of the obstacles to the growth of upstart young Linux, and their days in the industry are numbered. The Register also points to the possibility that this is an in-house solution to keep down the cost of running Microsoft's own cloud applications.
"That's not a stunning feat: having developed Windows RT – a version of Windows 8 running on ARM chippery – Microsoft clearly has the know-how to get the job done. And it's not an indication that Microsoft intends to make Windows Server on ARM a product. It's just a test."
Here is some more Tech News from around the web:
- Windows 10 Gets a Package Manager For the Command Line @ Slashdot
- Microsoft shows off spanking Win 10 PCs, compute-tastic Azure @ The Register
- Microsoft Office for Android tipped to arrive in November @ The Inquirer
- Universal Translator @ MAKE:Blog
- Best travel gadgets 2014 @ The Inquirer
- Win an ASUS ROG Swift 144Hz G-Sync monitor @ KitGuru
Subject: General Tech | October 28, 2014 - 01:46 PM | Jeremy Hellstrom
Tagged: microsoft, win7, inevitable
It is official: at the end of this month, consumers will no longer be able to get their hands on a machine with Windows 7 installed, unless they luck into one that has been sitting on the shelves for a while. If you buy through a corporate account, you will still be able to order a machine with Win7, but that will be the only way to get your hands on the OS, which is already almost impossible to find. That puts shoppers in a bit of a bind, as Win10 will not arrive for a while yet, leaving Win 8.1 as the only Microsoft-based option. Of course, there is always Linux; now that many games and distribution platforms such as Steam support the free OS, it is a viable choice for both productivity and entertainment. You can get more details at Slashdot, or vent your spleen in the comments section.
"This Friday is Halloween, but if you try to buy a PC with Windows 7 pre-loaded after that, you're going to get a rock instead of a treat. Microsoft will stop selling Windows 7 licenses to OEMs after this Friday and you will only be able to buy a machine with Windows 8.1. The good news is that business/enterprise customers will still be able to order PCs 'downgraded' to Windows 7 Professional."
Here is some more Tech News from around the web:
- SUSE Linux Enterprise 12 Debuts With 'Rock-Solid' Cloud Support @ Linux.com
- Microsoft brings the CLOUD that GOES ON FOREVER @ The Register
- QuarkXpress 2015 to launch early next year with 64-bit speed boost @ The Inquirer
- Lumia 830: Microsoft hopes to seduce with slim 'affordable' model @ The Register
Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM | Ryan Shrout
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd
A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the PlayStation 4 variants of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?
For those of us who focus more on the world of PC gaming, however, the following week an email to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, while also addressing the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already hit the peak graphical compute capability of these two new gaming consoles. Here is a portion of the information:
The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.
What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.
We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.
So, if we take this anonymous developer's information as true - and this whole story is based on that assumption - then we have learned some interesting things.
- The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
- The Xbox One (after giving developers access to more compute cycles previously reserved for Kinect) is within 1-2 FPS of the PS4.
- The Ubisoft team sees Unity as being "crazily optimized" for the architecture and consoles, even as we only now approach the one-year anniversary of their release.
- Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the remaining 50% that is left to power the AI, etc.
It would appear that just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the Playstation 4 and Xbox One undershoots the needs of game developers to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have reached performance limits, that's a bad sign for game developers that really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom built cores or using a Cell architecture - we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that we have seen more advanced development teams hit peak performance.
If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:
| | PlayStation 4 | Xbox One |
| --- | --- | --- |
| Processor | 8-core Jaguar APU | 8-core Jaguar APU |
| Memory | 8GB GDDR5 | 8GB DDR3 |
| Graphics Card | 1152 Stream Unit APU | 768 Stream Unit APU |
| Peak Compute | 1,840 GFLOPS | 1,310 GFLOPS |
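For context, the peak-compute row follows directly from shader count and clock speed, since each stream processor can retire one fused multiply-add (two floating-point operations) per cycle. A quick check, assuming the commonly reported GPU clocks of 800 MHz (PS4) and 853 MHz (Xbox One), which are not listed in the table above:

```latex
\text{Peak} = N_{\text{shaders}} \times 2\,\tfrac{\text{FLOP}}{\text{cycle}} \times f_{\text{clock}}
\qquad
\begin{aligned}
\text{PS4: } & 1152 \times 2 \times 0.800\,\text{GHz} = 1843.2\,\text{GFLOPS} \\
\text{XB1: } & 768 \times 2 \times 0.853\,\text{GHz} = 1310.2\,\text{GFLOPS}
\end{aligned}
```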
The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance trails Intel's Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super-low-cost tablets today, and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.
If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).
Also note that if the developer is using 50% of the CPU resources for rendering computation, and the remaining 50% isn't able to hold up its duties on AI, etc., then we have likely hit performance walls on the x86 cores as well.
Even if this developer's quote is 100% correct, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next, which is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.
But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same architecture as the PC.
Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?
UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of whose hardware is inside the consoles, had Microsoft and Sony targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher-performance hardware, selling the consoles at a loss out of the gate and properly preparing each platform for the next 7-10 years. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware, or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.
Subject: General Tech | October 9, 2014 - 01:31 PM | Jeremy Hellstrom
Tagged: microsoft, surface, fail
It seems that Microsoft might be catching on to something everyone else in the market knew when the company first announced its foray into hardware since the Zune: software companies shouldn't annoy their customers by competing with them. Ballmer originally tried to assuage companies like Acer by claiming that Surface was just a proof of concept, a claim met with disbelief, and after three iterations of Surface those doubts were proven justified. According to Microsoft, the Surface 3 is a big hit overseas, but as this is their first crack at those markets, you can bet that sales will follow the same precipitous drop we saw for the first Surface in North America.
The news from DigiTimes today is that Surface 3 will be the last generation of this hybrid tablet. It could be that Microsoft will now focus on its phones, much to the dismay of those who have used those phones, though perhaps the remaining human assets from Nokia will bring forth a new generation of workable devices.
"Microsoft continues to see weak sales for its Surface Pro 3 tablet and is reportedly planning to cancel the product line since shipment performance has been far lower than expectations, according to sources from the upstream supply chain."
Here is some more Tech News from around the web:
- How to Build and Tune an Open Source 3D Printer on Linux @ Linux.com
- Windows 10 feedback: 'Microsoft, please do a deal with Google to use its browser' @ The Register
- Fail of the Week: Transparent Circuit Design is Clearly a Challenge @ Hack a Day
- LTE's backers vow to KILL OFF WI-FI and BLUETOOTH @ The Register
- So long, Rory: AMD board names Lisa Su president and CEO @ The Register
- Lisa Su promoted to AMD CEO as Rory Read steps down @ The Tech Report
- The TR Podcast 163: Windows goes to 10 and Maxwell does DSR