Subject: General Tech, Mobile, Shows and Expos | October 6, 2013 - 01:14 PM | Scott Michaud
Tagged: mozilla, Mozilla Summit 2013
The second day of Mozilla Summit 2013 kicked off with three more keynote speeches, a technology fair, and two blocks of panels. After two days and about two dozen demos, several extremely experimental, I am surprised to have seen only one legitimate demo failure: an attempt to connect two 3D browser games in multiplayer over WebRTC… and even that seemed to be the fault of a stray automatic Windows Update on the host PC.
Okay, technically another demo “failed”: an audience member asked, from the crowd, to point Servo, a Mozilla Labs browser prototype, at an arbitrary website, which required HTTPS and caused the engine to nope out. I do not count that one.
Lastly, we saw a demo of the APC Paper which is expected to lead Firefox OS into the desktop market. It is actually a little smaller than I expected from the pictures.
One more day before everyone heads home. So far not much has happened but I will keep you updated as things occur.
Subject: General Tech, Mobile, Shows and Expos | October 5, 2013 - 03:58 PM | Scott Michaud
Tagged: mozilla, Mozilla Summit 2013
I have been volunteering with Mozilla since about a month after I read the Windows Store certification requirements (prior to that, I was ramping up development of modern apps). Due to that volunteer work, I am currently attending Mozilla Summit in Toronto. The first day, Friday, was filled with keynotes including some partially-new announcements.
Mozilla has a number of branded elevator doors, signs, and carpets covering the hotel to promote the event for the attendees. Unfortunately, my hotel room was not in the tower this elevator serviced. Also unfortunate: I did not realize that until I was on said elevator at the 27th floor. Moving between the first and 27th floors took all of about 5 seconds; popping my ears took longer. To be fair, the hotel staff gave me correct directions; I just did not realize that the building was, in fact, multiple buildings, and so my interpretation was off.
On to the important stuff: explosions! The second keynote contained high-performance 3D browser games and, albeit less kablooie, site personalization.
The latter we have talked about before. Mozilla is implementing interface elements in the browser for users to share demographic information with websites. They understand that advertising is how the web works and do not want it outright dead. They do believe (at least some) advertisers mine more data from their users than they actually need. One-on-one conversations with a couple of Mozilla staffers somewhat confirm my suspicion that the initiative is meant to remove the temptation to grab just a little more data by offering homegrown solutions. This seems to be their last idea, however, given the discussion at the panel.
The former was an Unreal Engine demo on stage during the “Envisioned Future State” keynote. The presenter landed several multi-kills with a rocket launcher. I should note that the entire demo ran off of the file protocol, so no internet connection was required. This was quite literally Unreal Tournament 3 running natively in Firefox.
Well, I think that is it for today! A lot of information was released, but I believe these were the two most interesting points.
Subject: General Tech, Shows and Expos | October 3, 2013 - 01:41 PM | Jeremy Hellstrom
Tagged: kingston hyper x, kingston, DOTA 2, competition
Fountain Valley, CA – October 3, 2013 − Kingston Technology Company Inc., the independent world leader in memory products, will soon begin two global competitions to further show its support of, and commitment to, the eSports and enthusiast communities. The HyperX DotA 2 League features 16 of the world’s top professional DotA 2 gaming teams battling for a large cash prize. On October 7, HyperX will begin an open global overclocking competition. The finals for both competitions will be held during 2014 International CES® in Las Vegas, Nevada.
The HyperX DotA 2 League tournament begins later this month with 16 teams competing for a total of $50,000 (USD) in prize money. An additional $40,000 will be offered to cover flight and hotel for the top four teams that advance to battle each other in Las Vegas for the championship. Each match is a best-of-three maps and all matches will be broadcast live so fans can follow the progress of their favorite team. The format and complete competition details can be found here.
Working together with HWBOT, the premier informational website for overclockers and performance enthusiasts, contestants will compete to post the highest benchmarks for Maximum Memory Frequency, Super PI and Intel® XTU. Beginning October 7, there will be an open online qualifying competition lasting four weeks. Winners will be determined weekly with the five final contestants competing in January 2014 during CES. For the finals, components will be supplied by Kingston and its partners: ASUS, Cooler Master and Intel®. Complete rules can be found here.
“The HyperX 2013 DotA 2 tournament will be epic as the best professional gaming teams in the world battle each other and fans will be able to watch every minute live online,” said Annie Leung, HyperX global strategic marketing manager, Kingston. “We are also very excited to hold an overclocking competition globally to see how far HyperX memory can be pushed. Both events will be fun and exciting for gamers and enthusiasts.”
Please visit the Kingston HyperX Website for more information.
Kingston is celebrating 25 years in the memory industry. The company was founded on October 17, 1987, and has grown to become the largest third-party memory manufacturer in the world. The 25th anniversary video can be found here along with more information, including a timeline of Kingston's history. In addition, HyperX memory is celebrating its 10th anniversary. The first HyperX high-performance memory module was released in November 2002.
Subject: General Tech, Graphics Cards, Shows and Expos | September 25, 2013 - 05:23 PM | Scott Michaud
Tagged: radeon, R9 290X, R9, R7, GPU14, amd
The next generation of AMD graphics processors is being announced this afternoon. AMD carefully mentioned this event is not a launch. We do not yet know, although I hope we will learn today, when you can give them your money.
When you can, you will have five products to choose from:
- R7 250
- R7 260X
- R9 270X
- R9 280X
- R9 290X
AMD only provides 3DMark Fire Strike scores for performance. I assume they are using the final score, and not the "graphics score", although they were unclear.
The R7 250 is the low-end card of the group with 1GB of GDDR5. Performance, according to 3DMark scores (>2000 on Fire Strike), is expected to be about two-thirds of what an NVIDIA GeForce GTX 650 Ti can deliver. Then again, that card retails for about $130 USD while the R7 250 is expected to retail for less than $89 USD. This is a pretty decent offering which can probably play Battlefield 3 at 1080p if you keep the graphics quality settings somewhere around "medium". This is just my estimate, of course.
The R7 260X is the next level up. The RAM has been doubled over the R7 250 to 2GB of GDDR5, and its 3DMark score almost doubled too (>3700 on Fire Strike). This puts it almost smack dab atop the Radeon HD 6970 while, at an expected $139 USD, costing about $20-30 less. A good price cut while keeping up to date on architecture.
The R9 270X is the low end of the high-end parts. With 2GB of GDDR5 and a 3DMark Fire Strike score of >5500, this is aimed at the GeForce GTX 670. The R9 270X will retail for around $199, which is about $120 USD cheaper than NVIDIA's offering.
The R9 280X should be pretty close to the 7970 GHz Edition. It will be about $90 cheaper, with an expected retail price of $299. It also has a bump in frame buffer over the lower-tier R9 270X, containing 3GB of GDDR5.
Not a lot is known about the top-end R9 290X, except that it will be the first gaming GPU to cross 5 TeraFLOPs of compute performance. For comparison, the GeForce Titan has a theoretical maximum of 4.5 TeraFLOPs.
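Taking AMD's quoted figures at face value, a quick back-of-the-envelope script can compare the lineup on Fire Strike points per dollar. This is a sketch using only the numbers above (scores are ">" floors from the slides, prices are expected USD retail), so treat the output as rough guidance rather than a benchmark:

```python
# Rough value comparison for the three cards AMD quoted Fire Strike
# floors for. Scores are ">" figures from the announcement and prices
# are expected USD retail, so this is a sketch, not a benchmark.
lineup = {
    "R7 250":  (2000, 89),   # (Fire Strike floor, expected price USD)
    "R7 260X": (3700, 139),
    "R9 270X": (5500, 199),
}

# Points per dollar for each card.
value = {name: score / price for name, (score, price) in lineup.items()}

for name, ppd in sorted(value.items(), key=lambda kv: kv[1]):
    print(f"{name}: ~{ppd:.1f} Fire Strike points per dollar")
```

Interestingly, by these numbers the value actually improves as you climb the stack, at least until the unpriced R9 280X and R9 290X enter the picture.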
If you are interested in the R9 290X and Battlefield 4, you will be able to pre-order a limited edition package containing both products. Pre-orders open "from select partners" October 3rd. For how much? Who knows.
We will keep you informed as we are informed. Also, the announcement is still going on, so tune in!
Subject: General Tech, Shows and Expos | September 23, 2013 - 09:38 PM | Scott Michaud
Tagged: JavaOne, JavaOne 2013, gpgpu
Are the enterprise users still here? Oh, hey!
GPU acceleration throws a group of many similar calculations at thousands of simple cores. This architecture makes GPUs very cheap and power efficient for the amount of work they achieve. Gamers, obviously, enjoy that efficiency at tasks such as calculating pixels on a screen or modifying thousands of vertex positions, but the technology has evolved beyond graphics into more general computation. Enterprise and research applications have been taking notice over the years.
GPU discussion, specifically, starts around 16 minutes.
Java, a friend of scientific and "big-data" developers, is also evolving in a few directions including "offload".
IBM's CTO of Java, John Duimovich, discussed a few experiments they created while optimizing the platform to use new hardware. Sorting arrays, a common task, saw between a 2-fold and 48-fold increase in performance. Including the latency of moving data and initializing GPU code, a 32,000-entry array took less than 1.5ms to sort, compared to about 3ms on the CPU. The sample code was programmed in CUDA.
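IBM's sample code was CUDA called from Java, which we do not have, but the trade-off Duimovich described (a fixed transfer-and-launch latency against a faster per-element sort) is easy to model. Below is an illustrative Python sketch; the overhead and speedup constants are my assumptions, loosely tuned to the quoted 32,000-entry numbers, not IBM's measurements:

```python
import math

# Toy cost model of the GPU-offload trade-off: the GPU sorts faster per
# element but pays a fixed latency for data transfer and kernel launch.
# Constants are illustrative assumptions, tuned so a 32,000-entry sort
# costs ~3 ms on the CPU as quoted; they are not IBM's data.
C = 3.0 / (32_000 * math.log2(32_000))  # CPU ms per n*log2(n) unit

def cpu_sort_ms(n):
    """n*log2(n) cost model for the CPU sort."""
    return C * n * math.log2(n)

def gpu_sort_ms(n, overhead_ms=0.5, speedup=4.0):
    """Same work divided by an assumed speedup, plus fixed overhead."""
    return overhead_ms + cpu_sort_ms(n) / speedup

def break_even():
    """Smallest array size where offloading wins under this model."""
    n = 2
    while gpu_sort_ms(n) >= cpu_sort_ms(n):
        n += 1
    return n
```

Under these assumptions the model reproduces the shape of IBM's result: large arrays amortize the transfer cost, while small ones are better left on the CPU, which is presumably why an automatic runtime would need a size threshold before offloading.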
The goal of these tests is, as far as I can tell, to (eventually) automatically use specialized hardware for Java's many built-in libraries. The pitch is free performance. Of course there is only so much you can get for free. Still, optimizing the few usual suspects is an obvious advantage, especially if it just translates average calls to existing better-suited libraries.
Hopefully they choose to support more than just CUDA whenever they take it beyond experimentation. The OpenPOWER Consortium, responsible for many of these changes, currently consists of IBM, Mellanox, TYAN, Google, and NVIDIA.
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | September 16, 2013 - 09:15 PM | Scott Michaud
Tagged: Steam Box, LinuxCon, Gabe Newell
Valve Software, as demonstrated a couple of days ago, still believes in Linux as the future of gaming platforms. Gabe Newell discussed this situation at LinuxCon this morning, which was streamed live over the internet (I transcribed it after the teaser break at the bottom of the article). Someone decided to rip the stream, not the best quality but good enough, and put it on Youtube. I found it and embedded it below. Enjoy!
Gabe Newell highlights, from the seventh minute straight through to the end, why proprietary platforms look successful and how they (sooner or later) fail by their own design. Simply put, you can control what is on your platform. Software you do not like, or even its updates, can be stuck in certification or even excluded from the platform entirely. You can limit malicious software, at least to some extent, or even competing products.
Ultimately, however, you limit yourself by not feeding in to the competition of the crowd.
If you wanted to get your cartridge made you bought it, you know, FOB in Tokyo. If you had a competitive product, miraculously, your ROMs didn't show up until, you know, 3 months after the platform holder's product had entered market and stuff like that. And that was really where the dominant models for what was happening in gaming ((came from)).
But, not too surprisingly, open systems were advancing faster than the proprietary systems had. There used to be these completely de novo graphics solutions for gaming consoles and they've all been replaced by PC-derived hardware. The openness of the PC as a hardware standard meant that the rate of innovation was way faster. So even though, you would think, that the console guys would have a huge incentive to invest in it, they were unable to be competitive.
Microsoft attempts to exert control over its platform with modern Windows and is met by a year-over-year regression in PC sales; at the same time, PC gaming is the industry's hotbed of innovation and is booming as a result. In a time of declining PC hardware sales, Steam saw 76% growth (unclear, but it sounds like revenue) over last year.
Valve really believes the industry will shift toward a model with little divide between creator and consumer. The community has been "an order of magnitude" more productive than the actual staff of Team Fortress 2.
Does Valve want to compete with that?
This will only happen with open platforms. Even the consoles, with systems sold below parts and labor costs to exert control, have learned to embrace the indie developer. The next-gen consoles market indie developers, prior to launch, seemingly more than the industry behemoths, and that includes their own titles. They open their platforms a little bit, but it might still not be enough to hold off the slow and steady advance of PC gaming, be it through Windows, Linux, or even web standards.
Speaking of which, Linux and web standards are often criticized for being fragmented. Gabe Newell, intentionally or unintentionally, claimed proprietary platforms are more fragmented. Open platforms have multiple bodies pushing and pulling the blob, but it all tends to flow in the same direction. Proprietary platforms are lean bodies with full control over where they go, just many of them: a dominant platform and a few competitors for each sector, such as phones and tablets, consoles, and desktops.
He noted each has a web browser and, because the web is an open standard, the browser is the most unified experience across devices of multiple sectors. Open fragmentation is small compared to the gaps between proprietary silos across sectors. ((As a side note: Windows RT is also designed to be one platform for all platforms but, as we have been saying for a while, you would prefer an open alternative to all RT all the time... and, according to the second and third paragraphs of this editorial, it will probably suffer from all of the same problems inherent to proprietary platforms anyway.))
Everybody just sort of automatically assumes that the internet is going to work regardless of wherever they are. There may be pluses or minuses of their specific environment but nobody says, "Oh I'm in an airplane now, I'm going to use a completely different method of accessing data across a network". We think that should be more broadly true as well. That you don't think of touch input or game controllers or living rooms as being things which require a completely different way for users to interact or acquire assets or developers to program or deliver to those targets.
Obviously if that is the direction you are going in, Linux is the most obvious basis for that and none of the proprietary, closed platforms are going to be able to provide that form of grand unification between mobile, living room, and desktop.
Next week we're going to be rolling out more information about how we get there and what are the hardware opportunities that we see for bringing Linux into the living room and potentially pointing further down the road to how we can get it even more unified in mobile.
Well, we will certainly be looking forward to next week.
Personally, for almost two years, I found it weird how Google, Valve, and Apple (if the longstanding rumors were true) were each pushing wearable computing, the living room (Steam Box/Apple TV/Google TV), and content distribution at the same time. I would not be surprised, in the slightest, if Valve added media functionality to Steam and Big Picture to secure a spot in the iTunes and Play Store market.
As for how wearables fit in? I could never quite figure that out but it always felt suspicious.
Subject: Processors, Shows and Expos | September 10, 2013 - 02:47 PM | Ryan Shrout
Tagged: idf, idf 2013, Intel, keynote, live blog
We are preparing for the second day of keynotes at IDF so sign up below to get a reminder for our live blog! After the first keynote saw the introduction of Intel Quark SoCs, showcases of the first 14nm Broadwell processor and a 22nm LTE smartphone, day 2 could be even more exciting!
The event starts at 9am PT / 12pm ET on Wednesday the 11th!
Subject: Processors, Shows and Expos | September 10, 2013 - 02:31 PM | Ryan Shrout
Tagged: quark, Intel, idf 2013, idf
In a very interesting and surprising announcement at the first Intel Developer Forum keynote this morning, Intel's new CEO Brian Krzanich showed the first samples of Quark, a new SoC design that will enter smaller devices than even Atom can reach.
Quark is the family name for the new line of SoCs that are open, synthesizable, and supported with industry-standard software. An open SoC is simply one that allows third-party IP integration with the processor cores, while a synthesizable one can be moved and migrated to other production facilities. This opens up Intel to take Quark outside of its own fabrication facilities (though Krzanich said during Q&A that they would prefer not to) and allows partners to more easily integrate their own silicon with the Quark DNA. Intel had previously announced that Atom would be able to integrate with third-party IP, but that seems to have been put on the back burner in favor of this.
Quark will not be an open core design in the same way that ARM's core can be, but instead Intel is opening up the interface fabric for direct connection to computing resources.
The Quark SoC is square in the middle
Krzanich showed off the chip on stage that is 1/5 the size of Atom and 1/10 the power levels of Atom (though I am not sure if we are referring to Clover Trail or Bay Trail for the comparison). That puts it in a class of products that only ARM-based designs have been able to reach until now and Intel displayed both reference systems and wearable designs.
UPDATE: Intel later clarified with me that the "1/5 size, 1/10 power" is for a Quark core against an Atom core at 22nm. It doesn't refer to the entire SoC package.
Intel hasn't yet told us what microarchitecture Quark is based on but if I were a betting man I would posit that it is related to the Silvermont designs we are looking at on Bay Trail but with a cut down feature set. Using any other existing design from Intel would result in higher than desired power consumption and die size levels but it could also be another ground up architecture as well.
I'll be poking around IDF for more information on Quark going forward but for now, it appears that Intel is firmly planting itself on a collision course with ARM and Qualcomm.
UPDATE 1: I did get some more information from Intel on the Quark SoC. It will be the first product based on the 14nm manufacturing process and is a 32-bit, single core, single thread chip based on a "Pentium ISA compatible CPU core." This confirms that it is an x86 processor though not exactly what CPU core it is based on. More soon!
Subject: Processors, Shows and Expos | September 10, 2013 - 11:02 AM | Ryan Shrout
Tagged: live blog, keynote, Intel, idf 2013, idf
UPDATE: You can see the replay of our live blog below from Day 1 of IDF but be sure you head over to the Day 2 Live Blog page to set a reminder! Join us on Wednesday at 9am PT / 12pm ET!!
Today is the beginning of the 2013 Intel Developer Forum in San Francisco! Join me at 9am PT for the first of three live blogs of the main Intel keynotes where we will learn what direction Intel is taking on many fronts!
Subject: General Tech, Systems, Shows and Expos | September 4, 2013 - 12:11 PM | Scott Michaud
Tagged: zenbook, ifa 2013, asus
How about some Ultrabooks? We got Ultrabooks. Thin, light, and brushed metal with ASUS' characteristic circular pattern. They are proud of that design, proud enough to cover it, the top of the lid, with a layer of Corning Gorilla Glass 3 to protect it from scratches and gouges. It feels a little absurd to say, but covering metal in glass for increased durability seems to make sense and could help your premium laptop look new for longer.
These 13.3-inch laptops come in two resolutions: 1920x1080 Full HD is the lower offering, with 2560x1440 for higher-end tastes. Both monitors are IPS-based with 10-point multi-touch capabilities.
The raw specifications for the UX301 are:
- Intel Core i5-4200U or i7-4500U or i7-4558U
- 4GB or 8GB DDR3L
- Intel HD 5100 Graphics
- SATA 3 SSD (up to 512GB RAID0)
- 13.3-inch 2560 x 1440 WQHD or 1920 x 1080 Full HD IPS multitouch
- 802.11ac (dual-band), Bluetooth 4.0
- Mini DisplayPort, Micro-HDMI 1.4
- 2x USB3.0, 3.5mm headphone/mic, SD card reader
- Windows 8 or Windows 8 Pro
For the UX302:
- Intel Core i5-4200U or i7-4500U
- 4GB DDR3L
- Intel HD 4400 or NVIDIA GeForce GT 730M (2GB)
- Up to a 750GB HDD with 16 GB SSD cache
- 1920 x 1080 Full HD IPS multitouch
- 802.11ac (dual-band), Bluetooth 4.0
- Mini Displayport, HDMI 1.4
- 3x USB 3.0, 3.5mm headphone/mic, SD card reader
- Windows 8 or Windows 8 Pro
The UX301 seems to be the more premium device with obviously higher specs, at least as options, despite being almost 2mm thinner (15.5mm vs 17.2mm for the UX302) and a quarter of a pound lighter (1.38kg vs 1.5kg for the UX302). Both models are listed with a 50W battery, which I assume means 50Wh, since watts are not a unit of electrical storage, but I am not entirely sure.
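Assuming the spec really does mean a 50Wh battery, estimating runtime is just capacity divided by average power draw. The draw scenarios below are illustrative assumptions, not ASUS figures:

```python
# If the "50W" spec really means a 50Wh battery, runtime is simply
# capacity divided by average draw. The wattage scenarios below are
# illustrative assumptions, not ASUS measurements.
CAPACITY_WH = 50

def runtime_hours(avg_draw_watts):
    return CAPACITY_WH / avg_draw_watts

for scenario, watts in [("light browsing", 7),
                        ("video playback", 12),
                        ("gaming", 30)]:
    print(f"{scenario} (~{watts}W draw): {runtime_hours(watts):.1f} hours")
```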
No information about pricing and availability has been released.
Subject: General Tech, Mobile, Shows and Expos | September 4, 2013 - 11:34 AM | Scott Michaud
Tagged: TF701T, ifa 2013, asus
Among the ASUS announcements is their new Transformer Pad TF701T. Being a Transformer Pad, the TF701T is an Android Tablet which can be used alone or docked in a keyboard for extra battery life and USB 3.0 support -- or, of course, for a keyboard. The touch display is IPS-based, 10.1", and with a native resolution of 2560x1600.
The other raw specifications include:
- NVIDIA Tegra 4 T40X quad-core SoC.
- 2GB DDR3L RAM
- WiFi B/G/N (dual-band) with Miracast support
- Bluetooth 3.0+EDR
- Speaker with ASUS "SonicMaster" technology
- 32GB and 64GB options
- MicroSDXC port on tablet if you need more storage.
- SDXC port on the dock if you need even more storage or, of course, to load pictures from a camera
- USB 3.0 (on dock).
- 3.5mm headphone/microphone jack.
I find it somewhat interesting that ASUS listed the Tegra 4 "T40X". It seems odd to declare a specific model when, unless I completely missed something, Tegra 4 has not been announced to be binned into multiple SKUs. This might suggest Tegra 4 will have more options than simply, "Get Tegra 4 or wait for Tegra 4i with the built-in Icera modem". Then again, it could be another case of over-description. Either way, it is something we will watch closely and report further on.
Pricing and availability information has not yet been released.
Subject: General Tech, Shows and Expos | August 23, 2013 - 11:31 PM | Scott Michaud
Tagged: battlefield 4, gamescom
Battlefield 4 is still at Gamescom, possibly the last big trade show before launch, and DICE is still trickling out news. They just released their "Levolution" teaser, which is a cheesy name and I will not pretend to ignore that, but an interesting premise for level design. Keeping in line with the original premise of Battlefield, levels are designed with too many options to fully consider, leading to the characteristic "Battlefield Moments".
A Battlefield Moment is when something amazing, genius, or plain stupid happens in a multiplayer match and draws your attention.
Some popular examples include:
- The airborne murdering of pilots.
- Hijacking jets semi-automatically.
- Getting to the choppah.
- Sure, you can die instead.
- Indecisive snipers.
Of course, capturing these moments is much more common since Bad Company 2, just due to how common video editing is. Battlefield 2 also had its moments. A personal favorite is when I dropped anti-tank mines, while parachuting, atop a heavy tank; it turns out the time they took to fall was enough for them to arm. I also enjoy proclaiming myself "The Kool-Aid Man" before crashing a jeep through a wall to run down someone in my Teamspeak channel.
DICE wants to show off their engine technology by including many user-triggered events, big and small, to give players options. You can destroy cover, lower doors, and otherwise succeed with out-of-box ideas. Check out the video, above, for more examples.
I am pretty sure the marketing people like showing off video of a skyscraper crumbling, too.
Battlefield 4 will launch October 29th in North America.
Subject: General Tech, Shows and Expos | August 23, 2013 - 03:39 PM | Scott Michaud
Tagged: battlefield 4, gamescom
EA TV conducted a broadcast of Battlefield 4, spectating gameplay from the attendees of Gamescom. Almost 13 minutes of footage were posted to Youtube and, because our readers are awesome, you get to see it here. Destructibility exceeding even Bad Company 2 is displayed in this map, Paracel Storm, as Russia and China fight over an island cluster.
... And the observer tools look interesting, too!
The hook of the map, at least one of them, is a windmill out in the ocean anchoring a destroyed frigate. Once the windmill is destroyed the wreckage is released, runs aground near a control point, and destroys a multi-level structure.
The video also announced another gametype, "Obliteration", which seems somewhat like a play on the Rush mode from Bad Company and Assault from Halo 2. Each team has 3 "MCOM" points which can be destroyed by planting a randomly spawning bomb. An interesting mashup.
I am, still, slightly upset that we have yet to see free bombing since Battlefield 2. Rumors have circulated, due to the promotional images from the China Rising DLC, that bombers will make a reappearance in at least that DLC. The bomber mechanic lets you play either role: in the sky, carpet bombing to clear predictable attack paths; on the ground, luring enemies into bombing paths while chatting with teammates in the sky.
Of course this is less relevant outside of the clan skirmish sphere, but still fun with a handful of friends on a public server.
What are your thoughts on the video? What are you looking forward to about Battlefield 4? What are you hoping for in Battlefield 4?
Stay tuned in case anything more comes out from Gamescom. Battlefield 4 launches October 29th with a Beta coming earlier that month.
Subject: General Tech, Shows and Expos | August 7, 2013 - 03:52 PM | Scott Michaud
Tagged: DOTA 2, valve
Valve has just commenced the third iteration of their giant DOTA 2 tournament, The International 3, with the first match (between teams Na'Vi and Orange) setting up and taking place as I type.
The prize pool, starting at 1.6 million dollars, is increased by $2.50 for each $9.99 tournament Compendium sold. Almost 500,000 were sold by this point, yielding a purse of $2.84 million USD; the prize for first place, alone, is currently $1.42 million USD. This continues The International's trend of being among the most lucrative video game tournaments, almost doubling Blizzard's 2013 StarCraft II World Championship Series, which also runs this weekend.
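For the curious, the pool mechanic is simple enough to check with a few lines of arithmetic. The sales count below is a hypothetical round number chosen to match the quoted $2.84 million, since the exact figure was only "almost 500,000":

```python
# The International 3 pool mechanic as described: a $1.6M base plus
# $2.50 from each $9.99 Compendium sold. The sales figure used here is
# a hypothetical round number chosen to match the quoted $2.84M pool.
BASE_POOL = 1_600_000
PER_COMPENDIUM = 2.50

def prize_pool(compendiums_sold):
    return BASE_POOL + PER_COMPENDIUM * compendiums_sold

pool = prize_pool(496_000)
print(f"pool: ${pool:,.0f}")            # matches the ~$2.84M figure
print(f"half for first: ${pool / 2:,.0f}")  # consistent with the $1.42M first prize
```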
The live Twitch stream, not counting fans at the venue and those cheering from a variety of pubs, currently serves 154,000 viewers. Check out their website to watch live, check the schedule, and view past results. The tournament will continue until Sunday. The first match will begin just a minute or two after this publishes.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | July 24, 2013 - 05:15 PM | Scott Michaud
Tagged: Siggraph, kepler, mobile, tegra, nvidia, unreal engine 4
SIGGRAPH 2013 is wrapping up in the next couple of days but, now that NVIDIA removed the veil surrounding Mobile Kepler, people are chatting about what is to follow Tegra 4. Tim Sweeney, founder of Epic Games, contributed a post to NVIDIA Blogs about the ways certain attendees can experience Unreal Engine 4 at the show. As it turns out, NVIDIA engineers have displayed the engine both on Mobile Kepler and, behind closed doors, on desktop PCs.
Not from SIGGRAPH, this is a leak from, I believe, GTC late last March.
Also, this is Battlefield 3, not Unreal Engine 4.
Tim, obviously taking the developer standpoint, is very excited about OpenGL 4.3 support within the mobile GPU. In all, he did not say too much of note. Epic is targeting Unreal Engine 4 at a broad range of platforms: mobile, desktop, console, and, while absent from this editorial, web standards. Each of these platforms is settling on the same set of features, albeit with huge gaps in performance, allowing developers to focus on a scale of performance instead of a flowchart of capabilities.
Unfortunately for us, there have yet to be leaks from the trade show. We will keep you up-to-date if we find any, however.
Subject: General Tech, Systems, Shows and Expos | June 28, 2013 - 03:11 AM | Scott Michaud
Tagged: Windows 8.1, MakerBot, BUILD 2013, BUILD
Even Microsoft believes that 3D printing is a cool movement.
Windows 8.1 will include native support for 3D printers, CNC machines, and laser cutting devices. According to a stage demo at BUILD, Microsoft expects printing in 3D to be as easy as printing in 2D. It might be hard to think of more than a few practical applications for a home user with access to such hardware, but people often do not realize how easily a problem could have been solved with the right tools. The support includes:
- A standardized driver model for 3D Printers, CNC machines, and laser cutting devices
- APIs for apps to interface with the above drivers
- Device apps and extensions through the Windows Store
- Job spooling and queuing
- Easy ways to query the device and its capabilities
The reliance upon the Windows Store might tell the larger tale. It appears that Microsoft is giving the nod to the maker community, not out of excitement, but to enable app developers to interface with these devices. Could the "modern" Windows APIs provide enough flexibility for 3D printing apps to exist without Microsoft's support? What about the next classification of peripherals?
All pondering aside...
The demo involved Antoine Leblond, of Microsoft, printing a vase on a MakerBot Replicator 2. According to TechCrunch, MakerBot will not only invade Windows 8.1 but also be stocked at Microsoft Stores. This is a solid retail win for the maker movement, giving users a chance to throw one of these in the back seat of the car and drive it home from the store.
Subject: General Tech, Mobile, Shows and Expos | June 28, 2013 - 02:05 AM | Scott Michaud
Tagged: BUILD, BUILD 2013, internet explorer, IE11, Windows 7
Windows 8.1 will be bundled with Microsoft's latest web browser, Internet Explorer 11. The line of browsers starting with Internet Explorer 9 comprises very competent offerings which approach, and sometimes eclipse, many competitors. Microsoft has made some errors since then, breaking standards for personal gain, but their recent efforts in supporting W3C – and even arch-nemesis Khronos – display genuine good faith.
HTML5 Developer Tools rivalling even Mozilla and Google
But Windows XP never made it past Internet Explorer 8 and, apart from glitch and vulnerability fixes, Windows 7 is in almost exactly the same state as the day Windows 8 shipped. Internet Explorer 10 made it to the platform, late and reluctantly, along with severely neutered back-ports of Windows 8 DirectX enhancements. The platform update was welcome, but lacked the importance of a full service pack.
More importantly, the hesitation to bring IE10 to Windows 7 suggested that it would be the last first-party web browser the platform would see.
Not true, apparently. During the Build conference, Engadget claims to have spoken with a Microsoft representative who confirmed that Windows 7 will receive the latest Internet Explorer. This is good news for every user of IE, and for every web designer whose cool WebGL implementation is held back by browser market-share concerns.
Honestly, my main concern is with the future of Internet Explorer, 12 and beyond. I am encouraged by Microsoft's recent efforts, but with Windows RT demanding that every browser be built atop Internet Explorer, it had better keep up or we are screwed. The web browser might be our operating system in the near future; applications should not be held back by the least of all possible platforms – be it Internet Explorer, WebKit, or any other dominant browser.
Of course, I should note that Engadget did not name its source, so a grain of salt would be wise until it ships.
Subject: General Tech, Shows and Expos | June 25, 2013 - 07:41 PM | Scott Michaud
Tagged: Windows 8.1, BUILD 2013, BUILD, Blue
June 26th will be the first time for the general public to, legitimately, experience the preview build of Windows 8.1, Windows Server 2012 R2, and the all blue'd-up Windows Server Essentials. This preview will be available through the Windows Store, but will not be upgradable to the final code.
Again, this preview will require, basically, Windows re-installation when final code hits.
Disclaimer aside, MSDN and TechNet subscribers got the updates for the server operating systems ahead of time. Included with this release is a feature dubbed "Desktop Experience", which provides the same interface as Windows 8.1.
Ars Technica played around with the updates and compiled several captioned screenshots into a carousel. Check them out. Of course, things could change, but this should be a better indication than -- for instance -- Windows 8 Developer Preview.
The final version of Windows 8.1 is expected to be released at some point later this year. The update will be free for current Windows 8 users.
Subject: General Tech, Cases and Cooling, Shows and Expos | June 22, 2013 - 01:43 PM | Scott Michaud
Tagged: noise cancellation, noctua, computex 2013, computex
Update (June 22-2013, 4:43pm EDT): I was contacted by Noctua about the TDP ratings... quoting from their email:
As for the question regarding the TDP rating of the original NH-D14, I'd like to stress that the cooler can *easily* handle any 130W CPU! Our D14 is renowned to be among the best performing heatsinks for overclocking on the market and many users have pushed their CPUs well beyond 250W using this cooler.
Noctua apparently prefers not to publish TDP ratings for its coolers because performance depends heavily on conditions such as room and case temperature. That makes sense; customers are better served by reading reviews to see what overclocks were actually achieved on comparable systems.
Yes, I know Computex is long over, but I missed something that I want to cover.
Noctua has been teasing active noise cancellation (ANC) for their CPU coolers for quite some time now; Tim published his brief thoughts, 13 months ago, on their press release leading up to Computex 2012. The prototype, this year, is a full unit rather than the fan from last year.
This design is a modified NH-D14 cooler with added technology from RotoSub AB that samples its own noise and emits a phase-inverted signal to destructively interfere with it. According to Noctua, this will be the first ANC cooling unit for a CPU. The plan, as their press release suggests, is to release a cooler with the model name "R-ANC", after its (R)otoSub (A)ctive (N)oise (C)ancellation technology. To me, this seems like a confusing choice of name, as it breaks away from their existing convention and limits naming options for future models based on this technology. Personally, I would have preferred "NH-D14R" or "NH-D14ANC", but alas I am not a marketer.
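The principle behind active noise cancellation is simple superposition: if you can emit a copy of the noise with its phase inverted, the two waves sum to (ideally) silence. A minimal numerical sketch, using a pure 120 Hz hum as a stand-in for fan noise (real ANC has to sample, filter, and re-emit in real time, which this toy ignores):

```python
import numpy as np

# Simulated fan noise: a 120 Hz hum, one second at 48 kHz.
rate = 48000
t = np.arange(rate) / rate
noise = 0.5 * np.sin(2 * np.pi * 120 * t)

# The cancelling signal is the sampled noise, phase-inverted.
anti_noise = -noise

# At the listener the two waves superimpose; destructive
# interference leaves nothing behind in this idealized case.
residual = noise + anti_noise

print(np.max(np.abs(noise)))     # 0.5 before cancellation
print(np.max(np.abs(residual)))  # 0.0 after cancellation
```

In practice the hard part, and presumably RotoSub's contribution, is generating that anti-phase signal accurately from the fan itself rather than from a separate speaker.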
Also, while researching this article, I was unable to find a canonical TDP rating for this device. I was not too surprised to have a difficult time for an unreleased product, but TDP is omitted even for the established, albeit louder, default NH-D14. Some sources claim this cooler can support an Intel i7 Extreme processor, which typically requires 130W of thermal dissipation; other sources say you should be somewhat cautious with CPUs above 95W TDP; some even claim it is great for air-only overclocking. Rolling all of these sources together, assuming a kernel of truth in each, I would expect this cooler (and, by extension, its upcoming R-ANC variant) to be good for decent air-only overclocks until you reach the -E series.
But, grain of salt, have some.
No word on pricing, but Noctua believes it will be available in spring/summer of next year. For reference, the default NH-D14 can be found for about $75-$100; expect the R-ANC to be slightly north of that.
Subject: Editorial, General Tech, Systems, Shows and Expos | June 17, 2013 - 03:16 AM | Scott Michaud
Tagged: xbox one, microsoft, ea, E3 13, E3
Update: Microsoft denies the statements from their support account... but this is still one of the major problems with DRM and closed platforms in general. It is stuff like this that they are able to do when you let them.
Consumers, whether they acknowledge it or not, fear the control that platform holders have over their content. It was hard for many to believe that having your EA account banned for whatever reason, even a dispute with a forum moderator, forfeited your license to the games you play through that account. Sounds like another great idea for Microsoft to steal.
@dohertymark If your account is banned, you also forfeit the licenses to any games that have licenses tied to it as listed in the ToU. ^AC
— Xbox Support (@XboxSupport1) June 14, 2013
Not stopping there, later on in the thread they were asked what would happen in the event of a security breach. You know, recourse before destroying access to possibly thousands of dollars of content.
@KillerRamen Ensure your account security features are enabled, and security proofs details are correct. ^ML
— Xbox Support (@XboxSupport1) June 15, 2013
While @XboxSupport1 is not a "verified account", the main @XboxSupport is, and it acknowledges ownership of this account in its background image. Honestly, there shouldn't be any doubt that these actually are Microsoft employees.
At this point, we have definitely surpassed absurdity. Sure, you typically need to do something fairly bad for Microsoft to stop charging you for Xbox Live. Removing access to your entire library of games, though, reads to me as an attempt to deter cheating and the hardware-modding community.
Great, encourage spite from the soldering irons, that works out well.
Don't worry, enthusiasts, you know the PC loves you.
Gaming as a form of entertainment is fundamentally different than gaming as a form of art. When content is entertainment, its message touches you without any intrinsic value and can be replaced with similar content. Sometimes a certain piece of content, itself, has specific value to society. It is these times where we should encourage efforts by organizations such as GoG, Mozilla and W3C, Khronos, and many others. Without help, it could be extremely difficult or impossible for content to be preserved for future generations and future civilizations.
It does not even need to get in the way of the industry and its attempt to profit from the gaming medium; a careless industry, on the other hand, can certainly get in the way of our ability to have genuine art. After all, this is the main reason why I am a PC gamer: the platform allows entertainment to co-exist with communities who support themselves when the official channels do not.
Unless, of course, Windows learns a little something from the Xbox. I guess do not get your Windows Store account banned in the future?