Subject: General Tech | March 20, 2014 - 03:01 PM | Jeremy Hellstrom
Tagged: bitcoin, dogecoin, internet of things, cryptocurrency
Not content with ruining the hopes of gamers wanting to upgrade to a Hawaii-based AMD GPU, cryptocurrency miners are now press-ganging your smart devices into service. Everything from TVs and fridges to printers, routers and security cameras can be infected with the Linux.Darlloz worm, which then begins mining for its author. The worm will even block other infections and has been observed patching certain holes in routers to prevent anything else from infecting the device and slowing down the mining computations. The Inquirer does have some humorous news about this worm: 31,716 separate IP addresses are infected, but this has managed to raise a mere $196.00 so far for the author.
"A WORM that leverages the Internet of Things to mine cryptocurrencies has been found to have infected around 31,000 devices."
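Those earnings numbers are worth dwelling on. A quick back-of-the-envelope calculation using the figures above shows just how little each hijacked gadget brings in:

```python
# Per-device earnings for the Linux.Darlloz mining botnet,
# using the figures reported above.
infected_devices = 31_716
total_earnings_usd = 196.00

per_device = total_earnings_usd / infected_devices
print(f"${per_device:.4f} per infected device")  # roughly $0.0062 each
```

In other words, the worm's author has earned well under a cent per compromised device; embedded processors simply are not built for mining.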
Here is some more Tech News from around the web:
- MSI Shows Off New Gaming Notebooks @ Kitguru
- Exclusive Interview with Richard Huddy from Intel at GDC @ Kitguru
- Top 10 Google Chromecast apps you should install @ The Inquirer
- Azure promises to guard virtual machines against migration dangers @ The Register
- 'Software amplifier' boosts quantum signals @ The Register
- Intel Desktop Roadmap Update - Devil's Canyon, Broadwell @ Legit Reviews
- Intel Talks Haswell-E, Broadwell, Devil's Canyon & More @ Hardware Canucks
- Intel to renew commitment to desktop PCs with a slew of new CPUs
- We're being royalty screwed! Pandora blames price rise on musos wanting money @ The Register
Subject: General Tech, Graphics Cards | March 19, 2014 - 08:26 PM | Ryan Shrout
Tagged: live blog, gdc 14, dx12, DirectX 12, DirectX
UPDATE: If you are looking for the live blog information including commentary and photos, we have placed it in archive format right over here. Thanks!!
It is nearly time for Microsoft to reveal the secrets behind DirectX 12 and what it will offer PC gaming going forward. I will be in San Francisco for the session and will be live blogging from it as networking allows. We'll have a couple of other PC Perspective staffers chiming in as well, so it should be an interesting event for sure! We don't know how much detail Microsoft is going to get into, but we will all know soon.
Microsoft DirectX 12 Session Live Blog
Thursday, March 20th, 10am PDT
You can sign up for a reminder using the CoverItLive interface below or you can sign up for the PC Perspective Live mailing list. See you Thursday!
Subject: General Tech, Shows and Expos | March 19, 2014 - 08:15 PM | Scott Michaud
Tagged: unreal engine 4, gdc 14, GDC, epic games
Game developers, from indie to the gigantic, can now access Unreal Engine 4 with a $19/month subscription (plus 5% of revenue from resulting sales). This is a much different model from UDK, which was free to develop games with their precompiled builds until commercial release, where an upfront fee and 25% royalty is then applied. For Unreal Engine 4, however, this $19 monthly fee also gives you full C++ source code access (which I have wondered about since the announcement that Unrealscript no longer exists).
Of course, the Unreal Engine 3-based UDK is still available (and just recently updated).
This is definitely interesting and, I believe, a response to publishers doubling down on developing their own engines. EA has basically sworn off engines outside of its own Frostbite and Ignite technologies. Ubisoft has only announced or released three games based on Unreal Engine since 2011; Activision has announced or released seven in that time, three of which were in that first year. Epic Games has always been very friendly to smaller developers and, with the rise of the internet, it is becoming much easier for indie developers to release content through Steam or even their own websites. These developers now have a "AAA" engine (and I think almost anyone would agree that Unreal Engine 4 qualifies) with an affordable license and full source access.
Speaking of full source access, licensees can access the engine at Epic's GitHub. While a top-five publisher might hesitate to share fixes and patches, the army of smaller developers might share and share-alike. This could lead to Unreal Engine 4 acquiring its own features rapidly. Epic highlights their Oculus VR, Linux and Steam OS, and native HTML5 initiatives but, given community support, there could be pushes into unofficial support for Mantle, TrueAudio, or other technologies. Who knows?
A sister announcement, albeit a much smaller one, is that Unreal Engine 4 is now part of NVIDIA's GameWorks initiative. This integrates various NVIDIA SDKs, such as PhysX, into the engine. The press release quote from Tim Sweeney is as follows:
Epic developed Unreal Engine 4 on NVIDIA hardware, and it looks and runs best on GeForce.
Another brief mention is that Unreal Engine 4 will have expanded support for Android.
Subject: General Tech, Shows and Expos | March 19, 2014 - 05:00 PM | Scott Michaud
Tagged: Mantle, gdc 14, GDC, crytek, CRYENGINE
While I do not have too many details otherwise, Crytek and AMD have announced that mainline CRYENGINE will support the Mantle graphics API. CRYENGINE, by Crytek, now sits alongside Frostbite, by Dice, and Nitrous, by Oxide Games, as engines which support that alternative to DirectX and OpenGL. This comes little more than a week after their announcement of native Linux support with their popular engine.
The tape has separate draw calls!
Crytek has been "evaluating" the API for quite some time now, having shown interest back at the AMD Developer Summit. Since then, they have apparently made a clear decision on it. It is also not the first time that CRYENGINE has been publicly introduced to Mantle: Chris Roberts' Star Citizen, also powered by the 4th-generation CRYENGINE, has already announced support for the graphics API. Of course, there is a large gap between having a licensee do the legwork to include an API and having the engine developer provide you with supported builds (the former would be like saying Unreal Engine 3 supports the original Wii).
Hopefully we will learn more as GDC continues.
Editor's (Ryan) Take:
As the week at GDC has gone on, AMD continues to push forward with Mantle and calls Crytek's implementation of the low-level API "a huge endorsement" of the company's direction and vision for the future. Many, myself included, have considered that the pending announcement of DX12 would be a major setback for Mantle, but AMD claims that view is "short-sighted" and says that as more developers come into the Mantle ecosystem, it is proof AMD is doing the "right thing."
Here at GDC, AMD told us they have expanded the number of beta Mantle members dramatically with plenty more applications (dozens) in waiting. Obviously this could put a lot of strain on AMD for Mantle support and maintenance but representatives assure us that the major work of building out documentation and development tools is nearly 100% behind them.
If stories like this one over at SemiAccurate are true and Microsoft's DirectX 12 really will be nearly identical to AMD's Mantle, then it makes sense that developers serious about new gaming engines can get a leg up on their projects by learning Mantle today. Applying that knowledge to the DX12 API upon its release could speed up development and improve implementation efficiency. From what I am hearing from the few developers willing to even mention DX12, Mantle is much further along in its release (late beta) than DX12 is (early alpha).
AMD was indeed talking with and sharing the development of Mantle with Microsoft "every step of the way," and AMD has stated on several occasions that there were two possible outcomes for Mantle: it either becomes, or inspires, a new industry standard in game development. Even if DX12 is more or less a carbon copy of Mantle, forcing NVIDIA to implement that API style with DX12's release, AMD could potentially have the advantage in gaming performance and support between now and Microsoft's DirectX release. That could be as much as a full calendar year from the reports we are getting at GDC.
Subject: General Tech | March 19, 2014 - 04:15 PM | Jeremy Hellstrom
Tagged: Homeworld, Homeworld 2, kick ass, Homeworld Remastered, gearbox
If you never had the chance to play Homeworld and Homeworld 2 then there is some good news you should hear; if you did play them then the news is even better! Gearbox is not just re-releasing the two games: they have redone the cinematics, models, textures and effects, plus added support for up to 4K resolution. The news at Rock, Paper, SHOTGUN makes it sound like some content from mods might be included, as well as revamped multiplayer. Maybe they will up the quality to the point where some later levels chug just as hard as they did when you first sat down in front of your mothership.
"We learned last year that Gearbox were planning to re-release the enormously loved Homeworld games. Having plucked the license from the THQ jumble sale, apparently because their Chief Creative Officer Brian Martel has maximum love for the series, they made clear their intentions to release an HD version of the first two games. They're upping their terminology now, from "HD" to "Remastered"."
Here is some more Tech News from around the web:
- Nostalgiablivion: Morrowind's First Quests In Skywind @ Rock, Paper, SHOTGUN
- Frog Fractions 2's Kickstarter Exists, Is Utterly Insane @ Rock, Paper, SHOTGUN
- Boooooo! - The Witcher 3 Delayed To 2015 @ Rock, Paper, SHOTGUN
- Titanfall @ The Inquirer
- AMD Mantle & TrueAudio Patch for THIEF Tested @ Legit Reviews
- Sony announces Project Morpheus virtual reality headset @ Hexus
Subject: General Tech, Shows and Expos | March 19, 2014 - 01:20 PM | Jeremy Hellstrom
Tagged: Imagination Technologies, gdc 14, wizard, ray tracing
The Tech Report visited Imagination Technologies' booth at GDC where the company was showing off a new processor, the Wizard GPU. It is based on the PowerVR Series6XT Rogue graphics processor and is specifically designed to accelerate ray tracing, a topic we haven't heard much about lately. They describe the performance as capable of processing 300 million rays and 100 million dynamic triangles per second, which translates to 7 to 10 rays per pixel at 720p and 30Hz, or 3 to 5 rays per pixel at 1080p and 30Hz. That is not bad, though Imagination Technologies estimates movies are rendered at 16 to 32 rays per pixel, so it may be a while before we see a Ray Tracing slider under Advanced Graphics Options.
"When we visited Imagination Technologies at CES, they were showing off some intriguing hardware that augments their GPUs in order to accelerate ray-traced rendering. Ray tracing is a well-known and high-quality form of rendering that relies on the physical simulation of light rays bouncing around in a scene. Although it's been used in movies and in static scene creation, ray tracing has generally been too computationally intensive to be practical for real-time graphics and gaming. However, Imagination Tech is looking to bring ray-tracing to real-time graphics—in the mobile GPU space, no less—with its new family of Wizard GPUs."
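The quoted rays-per-pixel figures can be sanity-checked against the raw throughput number. This sketch divides 300 million rays per second by the pixel throughput at each resolution (idealized; the lower ends of Imagination's ranges presumably account for real-world overhead):

```python
# Rays-per-pixel budget implied by a 300 Mray/s throughput figure.
RAYS_PER_SECOND = 300_000_000

def rays_per_pixel(width: int, height: int, fps: int) -> float:
    """Rays available per pixel per frame at the given resolution and rate."""
    return RAYS_PER_SECOND / (width * height * fps)

print(f"720p/30Hz:  {rays_per_pixel(1280, 720, 30):.1f} rays/pixel")   # ~10.9
print(f"1080p/30Hz: {rays_per_pixel(1920, 1080, 30):.1f} rays/pixel")  # ~4.8
```

The results line up with the upper ends of the 7-to-10 and 3-to-5 ranges quoted above.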
Here is some more Tech News from around the web:
- MoOx contacts make p-type transistor @ Nanotechweb
- Surrender your crypto keys or you're off to chokey, says Australia @ The Register
- Win XP holdouts storm eBay and licence brokers, hiss: Give us all your Windows 7 @ The Register
- Ubuntu Now Runs Well On The MacBook Air, Beats OS X In Graphics @ Phoronix
- Hidden 'Windigo' UNIX ZOMBIES are EVERYWHERE @ The Register
- Xbox boss Marc Whitten leaves Microsoft for Sonos as PS4 leads console sales @ The Inquirer
- Big Brother China Censors WeChat... Again @ TechARP
- Ergotech Freedom Quad 1-over-3 Desk Stand Review @ Techgage
- 10 Old Sprint Phones Can Now Get Totally Free Voice, Texts, and Data @ Gizmodo
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:03 AM | Scott Michaud
Tagged: WebCL, gdc 14, GDC
The Khronos Group has just ratified the standard for WebCL 1.0. The API is expected to provide a massive performance boost to web applications which are dominated by expensive functions which can be offloaded to parallel processors, such as GPUs and multi-core CPUs. Its definition also allows WebCL to communicate and share buffers between it and WebGL with an extension.
Frequent readers of the site might remember that I have a particular interest in WebCL. Based on OpenCL, it allows web apps to obtain a list of every available compute device and target it for workloads. I have personally executed tasks on an NVIDIA GeForce 670 discrete GPU and other jobs on my Intel HD 4000 iGPU, at the same time, using the WebCL prototype from Tomi Aarnio of Nokia Research. The same is true for users with multiple discrete GPUs installed in their system (even if they do not support CrossFire or SLI, or are from different vendors altogether). This could be very useful for physics, AI, lighting, and other game middleware packages.
Still, browser adoption might be rocky for quite some time. Google, Mozilla, and Opera Software were each involved in the working draft. This leaves both Apple and Microsoft notably absent. Even then, I am not sure how much interest exists within Google, Mozilla, and Opera to take it from a specification to a working feature in their browsers. Some individuals have expressed more faith in WebGL compute shaders than WebCL.
Of course, that can change with just a single "killer app", library, or middleware.
I do expect some resistance from the platform holders, however. Even Google has been pushing back on OpenCL support in Android, in favor of their "Renderscript" abstraction. The performance of a graphics processor is also significant leverage for a native app. There is little, otherwise, that cannot be accomplished with Web standards except a web browser itself (and there are even some non-serious projects for that). If Microsoft can support WebGL, however, there is always hope.
The specification is available at the Khronos website.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:02 AM | Scott Michaud
Tagged: OpenGL ES, opengl, opencl, gdc 14, GDC, EGL
The Khronos Group has also released their ratified specification for EGL 1.5. This API is at the center of data and event management between other Khronos APIs. This version increases security, interoperability between APIs, and support for many operating systems, including Android and 64-bit Linux.
The headline on the list of changes is the move of EGLImage objects from the realm of extensions into EGL 1.5's core functionality, giving developers a reliable method of transferring textures and renderbuffers between graphics contexts and APIs. Second on the list is increased security around creating a graphics context, primarily designed for WebGL applications, which any arbitrary website can be. Further down the list is the EGLSync object, which allows further cooperation between OpenGL (and OpenGL ES) and OpenCL: the GPU may not need CPU involvement when scheduling between tasks on both APIs.
During the call, the representative also wanted to mention that developers have asked them to bring EGL back to Windows. While it has not happened yet, they have announced that it is a current target.
The EGL 1.5 spec is available at the Khronos website.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:01 AM | Scott Michaud
Tagged: SYCL, opencl, gdc 14, GDC
To gather community feedback, the provisional specification for SYCL 1.2 has been released by The Khronos Group. SYCL builds upon OpenCL with the C++11 standard. The technology is built on another Khronos platform, SPIR, which allows the OpenCL C programming language to be mapped onto LLVM, with its hundreds of compatible languages (and Khronos is careful to note that they intend for anyone to be able to make their own compatible alternative language).
In short, SPIR allows many languages which can compile into LLVM to take advantage of OpenCL. SYCL is the specification for creating C++11 libraries and compilers through SPIR.
As stated earlier, Khronos wants anyone to make their own compatible language:
While SYCL is one possible solution for developers, the OpenCL group encourages innovation in programming models for heterogeneous systems, either by building on top of the SPIR™ low-level intermediate representation, leveraging C++ programming techniques through SYCL, using the open source CLU libraries for prototyping, or by developing their own techniques.
SYCL 1.2 supports OpenCL 1.2 and they intend to develop it alongside OpenCL. Future releases are expected to support the latest OpenCL 2.0 specification and keep up with future developments.
The SYCL 1.2 provisional spec is available at the Khronos website.
Subject: General Tech, Storage | March 18, 2014 - 06:58 PM | Ryan Shrout
Tagged: ssd, Samsung, ocz, Intel, corsair
Back in January I wrote a short editorial that asked the question: "Is now the time to buy an SSD?" At that time we were looking at a combination of price drops with a lack of upcoming hardware releases. Since that was published we have seen the release of the Intel 730 Series SSDs and even the new Crucial M550. While those are interesting drives to be sure (review pending on the M550), they aren't changing our opinions on the currently available, and incredibly cheap, solid state options.
While looking for some new hardware for the office, I found that the 1TB Samsung 840 EVO is now at an all-time low of $469! That is one of the faster SSDs on the market, and one of Allyn's favorites, for $0.469/GB!! I have included an updated table below with some of the most popular SSDs and their prices.
- Samsung 840 EVO 120 GB: $0.69/GB ($83 - Amazon)
- Samsung 840 EVO 250 GB: $0.55/GB ($139 - Amazon)
- Samsung 840 EVO 500 GB: $0.51/GB ($259 - Amazon)
- Samsung 840 EVO 750 GB: $0.51/GB ($388 - Amazon)
- Samsung 840 EVO 1000 GB: $0.46/GB ($469 - Amazon)
- Samsung 840 Pro 128 GB: $0.92/GB ($119 - Amazon)
- Samsung 840 Pro 256 GB: $0.77/GB ($199 - Amazon)
- Samsung 840 Pro 512 GB: $0.74/GB ($413 - Amazon)
- Intel 530 Series 120 GB: $0.91/GB ($89 - Amazon)
- Intel 530 Series 180 GB: $0.80/GB ($144 - Amazon)
- Intel 530 Series 240 GB: $0.62/GB ($149 - Amazon)
- Intel 530 Series 480 GB: $0.87/GB ($419 - Amazon)
- Crucial M500 Series 120 GB: $0.57/GB ($69 - Amazon)
- Crucial M500 Series 240 GB: $0.49/GB ($119 - Amazon)
- Crucial M500 Series 480 GB: $0.47/GB ($229 - Amazon)
- Crucial M500 Series 960 GB: $0.45/GB ($439 - Amazon)
The biggest price drops were seen in the higher capacity drives, including the Samsung 840 EVO 1TB and 750GB models, the Intel 530 Series 480GB drive, and even the Crucial M500 960GB and 480GB drives. Numerically the best value is the 960GB Crucial M500 at $0.45/GB, but it is followed very closely by that 1TB Samsung 840 EVO.
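The $/GB figures in the table are simple division, so any new listing is easy to check against them. A minimal sketch (prices are the Amazon figures quoted above and will of course drift):

```python
# Price-per-gigabyte, the value metric used in the table above.
def price_per_gb(price_usd: float, capacity_gb: int) -> float:
    return price_usd / capacity_gb

# The two best values from the table:
print(f"Crucial M500 960GB:  ${price_per_gb(439, 960):.3f}/GB")   # ~$0.457
print(f"Samsung 840 EVO 1TB: ${price_per_gb(469, 1000):.3f}/GB")  # $0.469
```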
As of now, the Intel 730 Series of SSDs is available for sale on Amazon.com, but its price per GB doesn't really match that of the EVO or M500. They are great drives, just read Allyn's review to see the proof of that, but they are targeted at the very performance conscious. The Crucial M550 is brand new and looks interesting; expect us to dive more into that line very soon.
For me personally, grabbing a 750GB SSD is incredibly enticing and I think I'll find a handful in my cart to update our older 180GB SSD test beds.
Subject: General Tech, Shows and Expos | March 18, 2014 - 12:38 PM | Jeremy Hellstrom
Tagged: gdc 14, amd, ocz, Vector 150
If you make it to the Game Developers Conference this year make sure to pay a visit to the AMD booth where you can get a look at OCZ's Vector 150 drives in action. They aim to show that these drives are not only good for the gamer, they are good for the game designer as well.
OCZ Vector 150 SSDs on Display at AMD Booth #1024, March 17-21 in San Francisco, CA
SAN JOSE, CA - March 17, 2014 - OCZ Storage Solutions - a Toshiba Group Company and leading provider of high-performance solid state drives (SSDs) for computing devices and systems, today announced its partnership with AMD to showcase the power of high performance technology at the Game Developer Conference (GDC) March 17-21 at the Moscone Center in San Francisco, CA. AMD's demo systems will feature best-in-class Vector 150 Series solid state drives demonstrating how developers can enhance productivity and efficiency in their work.
"We are excited to partner with AMD for the upcoming Game Developers Conference to support the fast growing interactive game development industry," said Alex Mei, CMO for OCZ Storage Solutions. "OCZ is dedicated to delivering premium solid state storage solutions that are not only a useful tool for developers, but also meet the unique demands of enthusiasts and gamers on all levels."
"Our presence at the 2014 Game Developer Conference will feature a number of high-performance gaming systems running 24/7 in harsh conditions," said Darren McPhee, director of product marketing, Graphics Business Unit, AMD. "We knew that OCZ Vector SSDs were uniquely ready to meet the reliability requirements of our gaming installations. Between the high performance graphics of AMD Radeon™ GPUs and the fast load times of OCZ Vector SSDs, visitors to AMD's booth in the South Hall are in for a great gaming experience!"
GDC is the world's largest game industry event, attracting over 23,000 professionals including programmers, artists, producers, designers, audio professionals, business decision-makers, and other digital gaming industry authorities. OCZ's premium Vector 150 Series, designed for workstation users along with bleeding-edge enthusiasts, will be in AMD systems that promote improved CPU and GPU performance, enhanced rendering, speed, and overall system performance. Professional developer applications demand peak transfer speeds and ultra-high performance; OCZ SSDs offer 100 times faster access to data, quicker boot ups, faster file transfers, and a more responsive computing experience than hard drives.
GDC enables OCZ to team up with valued industry partners like AMD to reaffirm the Company's commitment to the gaming segment, and to promote the use of flash storage for both developers and the gamers themselves.
Subject: General Tech | March 17, 2014 - 02:33 PM | Jeremy Hellstrom
Tagged: Internet, icann, iana, ntia
The root DNS of the internet has been run by the National Telecommunications and Information Administration, a division of the US Government, though in truth they contracted this responsibility out to the non-profit Internet Corporation for Assigned Names and Numbers and its Internet Assigned Numbers Authority. This will not really change much about the way the root DNS is handled; it is more a handover of official responsibility as the US Government gives up its past role as overseer of the internet. If ICANN and IANA are unfamiliar to you, pop over to The Register for a very brief overview of what they actually do.
"The US Department of Commerce is ready to leave the keys to the internet's worldwide DNS system in the hands of non-profit net overseer ICANN."
Here is some more Tech News from around the web:
- What could you do with 264 Sandy Bridge cores and 6TB of RAM? @ The Register
- Is no browser safe? Security bods poke holes in Chrome, Safari, IE, Firefox and earn $1m @ The Register
- Digitimes Research: Insufficient panel supplies to limit 2K-resolution smartphone penetration in 2014
- Origami Microscope for Just 50 Cents @ MAKE:Blog
- Intel to increase focus on desktop computers @ Kitguru
- CeBIT 2014: A Quick Tour @ Madshrimps
- CeBIT 2014: Fractal Shows off Node 804 and Watercooling Solution @ Madshrimps
- Office 2013 Crashes on Start Up - Solved - PCSTATS TechTip
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 17, 2014 - 09:01 AM | Scott Michaud
Tagged: OpenGL ES, opengl, Khronos, gdc 14, GDC
Today, day one of Game Developers Conference 2014, the Khronos Group has officially released the 3.1 specification for OpenGL ES. The main new feature, brought over from OpenGL 4, is the addition of compute shaders. This opens GPGPU functionality to mobile and embedded devices for applications developed in OpenGL ES, especially if the developer does not want to add OpenCL.
The update is backward-compatible with OpenGL ES 2.0 and 3.0 applications, allowing developers to add features, as available, for their existing apps. On the device side, most functionality is expected to be a driver update (in the majority of cases).
OpenGL ES, which stands for OpenGL for Embedded Systems but is rarely branded as such, delivers what Khronos considers the most important features from the graphics library to the majority of devices. The Khronos Group has been working toward merging ES with the "full" graphics library over time. The last release, OpenGL ES 3.0, was focused on becoming a direct subset of OpenGL 4.3. This release expands upon the feature-space it occupies.
OpenGL ES also forms the basis for WebGL. The current draft of WebGL 2.0 uses OpenGL ES 3.0 although that was not discussed today. I have heard murmurs (not from Khronos) about some parties pushing for compute shaders in that specification, which this announcement puts us closer to.
The new specification also adds other features, such as the ability to issue a draw without CPU intervention. You could imagine a particle simulation, for instance, that wants to draw the result after its compute shader terminates. Shading is also less rigid: vertex and fragment shaders do not need to be explicitly linked into a program before they are used. I inquired about the possibility that compute devices could be targeted (for devices with two GPUs) and possibly load balanced, in a method similar to WebCL, but no confirmation or denial was provided (although he did mention that it would be interesting for apps that fall somewhere in the middle of OpenGL ES and OpenCL).
The OpenGL ES 3.1 spec is available at the Khronos website.
Subject: General Tech | March 16, 2014 - 10:27 PM | Sebastian Peak
Tagged: supercomputer, solid state drive, NSF, flash memory
We know that SSDs help any system perform better by reducing the storage bottlenecks we all experienced with hard disk drives. But how far can flash storage go in increasing performance if money is no object? Enter the multi-million dollar world of supercomputers. Historically, supercomputers have relied on the addition of more CPU cores to increase performance, but two new system projects funded by the National Science Foundation (NSF) will try a different approach: obscene amounts of high-speed flash storage!
The news comes as the NSF is requesting a cool $7 billion in research money for 2015, and construction has apparently already begun on two new storage-centered supercomputers. Memory and high-speed flash storage arrays will be loaded on the Wrangler supercomputer at Texas Advanced Computing Center (TACC), and the Comet supercomputer at the San Diego Supercomputer Center (SDSC).
Check out the crazy numbers from TACC's Wrangler: a combination of 120 servers, each with Haswell-based Xeon CPUs, and a total of 10 petabytes (10,000TB!) of high performance flash data storage. The NSF says the supercomputer will have 3,000 processing cores dedicated to data analysis, with flash storage layers for analytics. The Wrangler supercomputer's bandwidth is said to be 1TB/s, with 275 million IOPS! By comparison, the Comet supercomputer will have “only” 1,024 Xeon CPU cores, with a 7 petabyte high-speed flash storage array. (Come on, guys... That’s like, wayyy less bytes.)
Supercomputer under construction…probably (Image credit CBS/Paramount)
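To put Wrangler's headline figures in per-server terms, here is a rough sketch (assuming the flash and IOPS are spread evenly across the 120 nodes, which the announcement does not actually specify):

```python
# Dividing Wrangler's headline numbers across its 120 servers.
SERVERS = 120
FLASH_TB = 10_000          # 10 petabytes of flash, expressed in TB
TOTAL_IOPS = 275_000_000

print(f"{FLASH_TB / SERVERS:.1f} TB of flash per server")    # ~83.3
print(f"{TOTAL_IOPS / SERVERS / 1e6:.2f}M IOPS per server")  # ~2.29
```

That is roughly an entire enthusiast SSD collection's worth of flash, per node.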
The supercomputers are part of the NSF's “Extreme Digital” (XD) research program, and their current priorities are "relevant to the problems faced in computing today”. Hmm, kind of makes you want to run a big multi-SSD deathwish RAID, huh?
Subject: General Tech | March 16, 2014 - 05:48 PM | Ryan Shrout
Tagged: march madness, contest
Well, it was a heart breaking game for my University of Kentucky Wildcats, but I'm looking forward to the NCAA Men's College Basketball Tournament all the more! In the past we have run PC Perspective bracket competitions and I had a couple of requests to do the same for 2014.
If you are interested in joining the gang at PC Perspective for this year's March Madness bracket challenge, all you have to do is visit our CBSSports.com page and register an account. You can then make your selections and see how well you predict the college basketball landscape for the tournament.
- Visit http://www.cbssports.com/fantasy/universal
- Select "Mayhem" as the sport
- Input "pcper2014" as the league abbreviation
- Password: "gobigblue"
- Sign up for a spot!
- Pick your bracket after the selection show tonight!
- Win some stuff!
For the winners we are going to offer up some extra hardware we have here at the office, BUT I haven't put together the list just yet. I'm going to be asking the other editors what we have that's not being used, and I'll update this post as soon as we come up with a final answer.
For now though, sign up, have fun and good luck!!
(Yes, you can enter from ANYWHERE in the world for this!)
Subject: Editorial, General Tech | March 16, 2014 - 03:27 AM | Scott Michaud
Tagged: windows, mozilla, microsoft, Metro
If you use the Firefox browser on a PC, you are probably using its "Desktop" application. They also had a version for "Modern" Windows 8.x that could be used from the Start Screen. You probably did not use it because fewer than 1000 people per day did. This is more than four orders of magnitude smaller than the number of users for Desktop's pre-release builds.
Yup, less than one-thousandth.
Jonathan Nightingale, VP of Firefox, stated that Mozilla would not be willing to release the product without committing to its future development and support. There was not enough interest to take on that burden and it was not forecast to have a big uptake in adoption, either.
From what we can see, it's pretty flat.
Paul Thurrott of WinSupersite does not blame Mozilla for killing "Metro" Firefox. He acknowledges that they gave it a shot and did not see enough pre-release interest to warrant a product. He places some of the blame on Microsoft for the limitations it places on browsers (especially on Windows RT). In my opinion, this is just a symptom of the larger problem of Windows post-7. Hopefully, Microsoft can correct these problems and do so in a way that benefits their users (and society as a whole).
Subject: General Tech, Cases and Cooling, Shows and Expos | March 15, 2014 - 01:44 AM | Scott Michaud
Tagged: GDC, gdc 14, valve, Steam Controller
Two months ago, Valve presented a new prototype of their Steam Controller with a significantly changed button layout. While the overall shape and two thumbpads remained constant, the touchscreen disappeared and the face buttons more closely resembled something from an Xbox or PlayStation. Another prototype image has been released, ahead of GDC, without many changes.
Valve is still in the iteration process for its controller, however. Ten controllers will be available at GDC, each handmade. This version has been tested internally for some undisclosed amount of time, but this will be the first time that others have given their feedback since the design shown at CES. The big unknown is: to what level are they going to respond to feedback? Are we at the stage where it is about button sizing? Or will it change radically, like to a two-slice toaster case with buttons inside the slots?
GDC is taking place March 17th through the 21st. The expo floor opens on the 19th.
Subject: General Tech | March 14, 2014 - 03:01 PM | Jeremy Hellstrom
Tagged: microsoft, office, office 365, tablet
The newest member of Microsoft's cloudy version of the world's most common productivity software is called Office 365 Personal, and it will provide a single license which can be used on one PC or Mac and one tablet. The subscription will cost less than the current Office 365 Home Premium, which allows up to five devices but only offers a version of Office dubbed Office Mobile for tablets and phones. This will not be the watered-down version of Office that ships with Windows RT on Surface, and while The Register was provided some hints about what the new software will look like, we won't be seeing any demos until closer to the launch, which will take place this spring.
"Microsoft will soon debut a new formulation of its Office 365 subscription service aimed at individual consumers, the company said on Thursday, and in the process it hinted that new, touch-centric Office apps may be coming soon."
Here is some more Tech News from around the web:
- Google starts encrypting search data to protect users from NSA snooping @ The Inquirer
- Microsoft gives away Windows Phone 8 licences in India – report @ The Register
- HTC One 2 release date, specs, rumours and price @ The Inquirer
- VLC Player beta arrives for Windows 8 @ The Inquirer
- The real story behind Twin Galaxies @ Kitguru
Subject: General Tech | March 13, 2014 - 02:58 PM | Ken Addison
Tagged: podcast, video, evga, gtx 780, 780 ACX, acx, titanfall, 750 ti, nvidia, 800m, 860m, razer
PC Perspective Podcast #291 - 03/13/2014
Join us this week as we discuss the EVGA GTX 780 ACX, Building a PC for Titanfall, NVIDIA's 800m GPU Lineup and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano, and Morry Teitelman
Week in Review:
0:41:35 This podcast is brought to you by Coolermaster, and the CM Storm Pulse-R Gaming Headset
News items of interest:
Hardware/Software Picks of the Week:
Allyn: Leaving in the middle of podcasts
Subject: General Tech | March 13, 2014 - 02:31 PM | Jeremy Hellstrom
Tagged: usb, charger, DIY
Over at Hack a Day is a guide on how to convert the old chargers from devices you no longer use into a useful charger with a USB plug. They will need to be of the 5V variety and provide at least 500 mA to be useful with today's gadgets, with 1A being preferable; don't go so high that you are at risk of killing your device, though. Apple fanatics will have to do the usual modifications to convince their iThing to accept a charge, but most other devices won't care whether the charger is home made or not; they just want the USB. Do try not to set yourself or any of your possessions on fire: test your charger thoroughly before leaving it unattended.
"If you’re like us, you probably have a box (or more) of wall warts lurking in a closet or on a shelf somewhere. Depending on how long you’ve been collecting cell phones, that box is likely overflowing with 5V chargers: all with different connectors."
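The 500 mA minimum matters because charge time scales directly with current. An idealized sketch (the 1500 mAh battery capacity is illustrative; real charging is slower due to conversion losses and taper near full):

```python
# Idealized charge time: battery capacity divided by charger current.
def charge_hours(battery_mah: int, charger_ma: int) -> float:
    return battery_mah / charger_ma

# A hypothetical 1500 mAh phone battery:
print(f"{charge_hours(1500, 500):.1f} h at 500 mA")  # 3.0
print(f"{charge_hours(1500, 1000):.1f} h at 1 A")    # 1.5
```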
Here is some more Tech News from around the web:
- BB10's 'dated' crypto lets snoops squeeze the juice from your BlackBerry – researcher @ The Register
- Replicant OS Developers Find Backdoor In Samsung Galaxy Devices @ Slashdot
- Hackers can steal Whatsapp conversations due to Android security flaw @ The Inquirer
- How to Use the Super Fast i3 Tiling Window Manager on Linux @ Linux.com
- Seven Great Moments in World Wide Web History @ The Inquirer
- How to shop wisely for the IT department of the future @ The Tech Report
- Projector on a smartphone? There's a chip for that @ The Register
- Make an HD Projector for Next to Nothing! @ Hack a Day
- Win a Powerful ASUS R9 290 Graphics Card @ Kitguru