Optical + Accelerometer
When I met with Logitech while setting up for our Hardware Workshop at Quakecon this year, they wanted to show me a new mouse they were coming out with. Of course I was interested but, to be honest, mice have seemingly reached a point where I can very rarely tell them apart in terms of performance. Logitech promised me this would be different. The catch? The G402 Hyperion Fury includes not just an optical sensor but an accelerometer and gyro combo.
Pretty much all mice today use optical sensors to generate data. The sensors basically take hundreds or thousands of photos of the surface of your desk or mouse pad and compare them to each other to measure how far and how fast you have moved your mouse. Your PC then reads that data from the mouse at a USB polling rate - up to 1000 Hz with this mouse - and translates it into mouse movement on your desktop and in games.
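As a toy illustration of that compare-the-photos idea (not Logitech's actual algorithm - real sensors do this in dedicated hardware thousands of times per second), here is a sketch that recovers mouse motion by sliding one synthetic frame over the previous one and picking the offset with the smallest difference:

```python
def texture(x, y):
    """Deterministic pseudo-random brightness for the surface under the mouse."""
    return (x * 73856093 ^ y * 19349663) & 0xFF

def frame(x, y, size=16):
    """The sensor's 'photo' of the surface when the mouse sits at (x, y)."""
    return [[texture(x + c, y + r) for c in range(size)] for r in range(size)]

def estimate_motion(prev, curr, max_shift=4):
    """Find the (dx, dy) movement that best aligns two successive frames
    by minimizing the sum of squared differences over an inner window."""
    size = len(prev)
    n = size - 2 * max_shift
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = sum(
                (prev[max_shift + r][max_shift + c]
                 - curr[max_shift - dy + r][max_shift - dx + c]) ** 2
                for r in range(n) for c in range(n))
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

# The mouse moved 3 px right and 2 px down between two frames:
print(estimate_motion(frame(10, 10), frame(13, 12)))  # (3, 2)
```

If the mouse moves farther than the search window between frames, no candidate offset matches well - which is exactly the "loses track" failure described below.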
There is an issue, though: at very high speeds of mouse movement, the optical sensor can fail. It essentially loses track of where it is on the surface and can no longer provide accurate data back to the system. At this point, depending on the design of the mouse and driver, the mouse may just stop sending data altogether or attempt to "guess" for a short period of time. Clearly that's not ideal, and it means that gamers (or any user for that matter) are getting inaccurate measurements. Boo.
To be quite honest though, that doesn't happen with modern mice at standard speeds, or even at typical "fast" gaming motions. According to Logitech, the optical sensor will start to lose tracking somewhere in the 150-180 IPS (inches per second) range. That's quite a lot - the low end of that range works out to 3.8 meters per second, or 8.5 miles per hour.
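Those conversions are quick to verify:

```python
def ips_to_metric(ips):
    """Convert a tracking speed in inches per second to m/s and mph."""
    meters_per_second = ips * 0.0254     # 1 inch = 0.0254 m exactly
    miles_per_hour = ips * 3600 / 63360  # 63,360 inches in a mile
    return meters_per_second, miles_per_hour

for ips in (150, 180):
    m_s, mph = ips_to_metric(ips)
    print(f"{ips} IPS = {m_s:.2f} m/s = {mph:.1f} mph")
```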
Introduction, Hardware, and Subjective Feel
This review comes before the end of the pre-order period. The reason I targeted that deadline is that the pre-order perks are quite significant. First, either version of the mouse is listed for about $50 off of its MSRP (which is half price for the plastic version). EVGA also throws in a mouse pad for registering your purchase. The plastic mouse is $49.99 during its pre-order period ($99.99 MSRP) and its carbon fiber alternative is $79.99 ($129.99 MSRP). EVGA has supplied us with the plastic version for review.
Being left-handed really puts a damper on my choice of gaming mice. If the peripheral is designed with thumb buttons, it needs to either be symmetric (because a right-handed mouse's thumb buttons would fall under my pinky or ring finger) or be an ergonomic, curved mouse that comes in a special left-handed version, mirrored horizontally. The latter is an obvious risk for manufacturers, especially when the market of left-handed gamers is further split by those who taught themselves to use right-handed mice.
Upgrades from Anker
Last year we started to have a large number of mobile devices around the office, including smartphones, tablets and even convertibles like the ASUS T100, all of which were charged over USB connections. While not a hassle when you are charging one or two units at a time, having 6+ on our desks on any given day started to become a problem for our less numerous wall outlets. Our solution last year was Anker's E150 25 watt wall charger, which we did a short video overview on.
It was great but had limitations, including different charging rates depending on the port you connected to, a limited total output of 5 amps across all five ports, and fixed outputs per port. Today we are taking a look at a pair of new Anker devices that implement smart ports, called PowerIQ, that enable the battery and wall charger to send as much power to the charging device as it requests, regardless of which physical port it is attached to.
We'll start with the updated Anker 40 watt 5-port wall charger and then move on to discuss the 3-port mobile battery charger, both of which share the PowerIQ feature.
Anker 40 watt 5-Port Wall Charger
The new Anker 5-port wall charger is actually smaller than the previous generation but offers superior specifications at every feature point. This unit can push out more than 40 watts total combined through all five USB ports - 5 volts at as much as 8 amps. We are told all 8 amps could in fact go through a single USB charging port if a device requested that much, though nothing in our offices seems to draw more than 2.3 A.
Any USB port can be used for any device on this new model; it doesn't matter where it plugs in. This greatly simplifies things from a user experience point of view, as you don't have to hold the unit up to your face to read the tiny text that existed on the E150. With 8 amps spread across all five ports you should have more than enough power to charge all your devices at full speed. If you happen to have five iPads charging at the same time, that would exceed 8 A and each device's charge rate would be a bit lower.
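Anker doesn't document exactly how the charger divides its 8 A budget when it is oversubscribed; a simple proportional-sharing model (our assumption, not the real PowerIQ logic) illustrates the behavior described:

```python
def allocate_current(requests, budget=8.0):
    """Split the charger's total current budget across ports.

    Assumption (not documented by Anker): when demand exceeds the
    budget, every port is scaled down proportionally.
    """
    total = sum(requests)
    if total <= budget:
        return list(requests)  # every device gets what it asked for
    scale = budget / total
    return [amps * scale for amps in requests]

# Two tablets and a phone: under budget, full speed for everyone.
print(allocate_current([2.4, 2.4, 1.0]))
# Five iPads at 2.4 A want 12 A total, so each drops to 1.6 A.
print(allocate_current([2.4] * 5))
```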
AM1 Walks New Ground
After Josh's initial review of the AMD AM1 Platform and the Athlon 5350, we received a few requests to look at gaming performance with a discrete GPU installed. Even though this platform isn't being aimed at gamers looking to play demanding titles, we started to investigate this setup anyway.
While Josh liked the ASUS AM1I-A Mini ITX motherboard he used in his review, with only a x1 PCI-E slot it would be less than ideal for this situation.
Luckily we had the Gigabyte AM1M-S2H Micro ATX motherboard, which features a full length PCI-E x16 slot, as well as 2 x1 slots.
Don't be misled by the shape of the slot, though: the AM1 platform still only offers 4 lanes of PCI Express 2.0. This, of course, means that the graphics card will not be running at full bandwidth. However, the physical x16 slot makes it a lot easier to connect a discrete GPU, without having to worry about those ribbon riser cables that miners use.
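The bandwidth cost of that x4 link is easy to quantify: PCI Express 2.0 runs at 5 GT/s per lane with 8b/10b encoding, which works out to 500 MB/s per lane in each direction:

```python
def pcie2_bandwidth_gb_s(lanes):
    """Per-direction bandwidth of a PCIe 2.0 link in GB/s.

    5 GT/s per lane, 8b/10b encoding (8 data bits per 10 bits
    transferred), 8 bits per byte.
    """
    return lanes * 5e9 * (8 / 10) / 8 / 1e9

print(pcie2_bandwidth_gb_s(4))   # AM1's electrical x4 link
print(pcie2_bandwidth_gb_s(16))  # a full x16 slot
```

So the card gets 2 GB/s each way instead of the 8 GB/s a full PCIe 2.0 x16 slot would provide.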
Athlon and Pentium Live On
Over the past year or so, we have taken a look at a few budget gaming builds here at PC Perspective. One of our objectives with these build guides was to show people that PC gaming can be cost competitive with console gaming, and at a much higher quality.
However, we haven't stopped pursuing our goal of the perfect inexpensive gaming PC, which is still capable of maxing out image quality settings on today's top games at 1080p.
Today we take a look at two new systems, featuring some parts which have been suggested to us after our previous articles.
| | AMD System | Intel System |
| --- | --- | --- |
| Processor | AMD Athlon X4 760K - $85 | Intel Pentium G3220 - $65 |
| Cores / Threads | 4 / 4 | 2 / 2 |
| Motherboard | Gigabyte F2A55M-HD2 - $60 | ASUS H81M-E - $60 |
| Graphics | MSI R9 270 Gaming - $180 | MSI R9 270 Gaming - $180 |
| System Memory | Corsair 8GB DDR3-1600 (1x8GB) - $73 | Corsair 8GB DDR3-1600 (1x8GB) - $73 |
| Hard Drive | Western Digital 1TB Caviar Green - $60 | Western Digital 1TB Caviar Green - $60 |
| Power Supply | Cooler Master GX 450W - $50 | Cooler Master GX 450W - $50 |
| Case | Cooler Master N200 MicroATX - $50 | Cooler Master N200 MicroATX - $50 |
(Editor's note: If you don't already have a copy of Windows, and don't plan on using Linux or SteamOS, you'll need an OEM copy of Windows 8.1 - currently selling for $98.)
These are low prices for a gaming computer, and feature some parts which many of you might not know a lot about. Let's take a deeper look at the two different platforms which we built upon.
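For reference, the cart totals work out as follows (Windows 8.1, at $98, would add to either build):

```python
# Parts shared by both builds, priced as listed above:
shared = {
    "MSI R9 270 Gaming": 180,
    "Corsair 8GB DDR3-1600": 73,
    "WD 1TB Caviar Green": 60,
    "Cooler Master GX 450W": 50,
    "Cooler Master N200": 50,
}

# Add each platform's CPU + motherboard:
amd_total = 85 + 60 + sum(shared.values())    # Athlon X4 760K + F2A55M-HD2
intel_total = 65 + 60 + sum(shared.values())  # Pentium G3220 + H81M-E

print(amd_total, intel_total)  # 558 538
```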
First up is the AMD Athlon X4 760K. While you may not have known the Athlon brand was still being used on current parts, it represents an interesting segment of the market. On the FM2 socket, the 760K is essentially a high-end Richland APU with the graphics portion of the chip disabled.
What this means is that if you are going to pair your processor with a discrete GPU anyway, you can skip paying extra for the integrated GPU.
As for the motherboard, we went for an ultra inexpensive A55 option from Gigabyte, the GA-F2A55M-HD2. This board features the A55 chipset which launched with the Llano APUs in 2011. Because of this older chipset, the board does not feature USB 3.0 or SATA 6G capability, but since we are only concerned about gaming performance here, it makes a great bare bones option.
1920x1080, 2560x1440, 3840x2160
Join us on March 11th at 9pm ET / 6pm PT for a LIVE Titanfall Game Stream! You can find us at http://www.pcper.com/live. You can subscribe to our mailing list to be alerted whenever we have a live event!!
We canceled the event due to the instability of Titanfall servers. We'll reschedule soon!!
With the release of Respawn's Titanfall upon us, many potential PC gamers are going to be looking for suggestions on compiling a list of parts targeted at a perfect Titanfall experience. The good news is that even with a fairly low investment in PC hardware, gamers will find that the PC version of this title is definitely the premier way to play, as the compute power of the Xbox One just can't compete.
In this story we'll present three different build suggestions, each addressing a different target resolution but also better image quality settings than the Xbox One can offer. We have configurations for 1080p (the best the Xbox One can manage), 2560x1440, and even 3840x2160, better known as 4K. In truth, the graphics horsepower required by Titanfall isn't overly extreme, and thus an entire PC build coming in under $800, including a full copy of Windows 8.1, is easy to accomplish.
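To put those resolutions in perspective, here is the raw pixel count each target asks the GPU to fill every frame:

```python
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080  # pixels in a single 1080p frame

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

4K is exactly four times the pixels of 1080p, which is why the graphics card budget grows so quickly across the three builds.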
Target 1: 1920x1080
First up is old reliable, the 1920x1080 resolution that most gamers still have on their primary gaming display. That could be a home theater style PC hooked up to a TV or monitors in sizes up to 27-in. Here is our build suggestion, followed by our explanations.
| Titanfall 1080p Build | |
| --- | --- |
| Processor | Intel Core i3-4330 - $137 |
| Motherboard | MSI H87-G43 - $96 |
| Memory | Corsair Vengeance LP 8GB 1600 MHz (2 x 4GB) - $89 |
| Graphics Card | EVGA GeForce GTX 750 Ti - $179 |
| Storage | Western Digital Blue 1TB - $59 |
| Case | Corsair 200R ATX Mid Tower Case - $72 |
| Power Supply | Corsair CX 500 watt - $49 |
| OS | Windows 8.1 OEM - $96 |
| Total Price | $781 - Amazon Full Cart |
Our first build comes in at $781 and includes some incredibly competent gaming hardware for that price. The Intel Core i3-4330 is a dual-core, HyperThreaded processor that provides more than enough capability to push Titanfall and all other major PC games on the market. The MSI H87 motherboard lacks some of the advanced features of the Z87 platform but does the job at a lower cost. 8GB of Corsair memory, though not running at a high clock speed, provides more than enough capacity for all the games and applications you could want to run.
What Mantle signifies about GPU architectures
Mantle is a very interesting concept. From the various keynote speeches, it sounds like the API is being designed to address the current state (and trajectory) of graphics processors. GPUs are generalized and highly parallel computation devices which are assisted by a little bit of specialized silicon, when appropriate. The vendors have even settled on standards, such as IEEE 754 floating point numbers, which means that the driver has much less reason to shield developers from the underlying architectures.
Still, Mantle is currently a private technology for an unknown number of developers. Without a public SDK, or anything beyond the half-dozen keynotes, we can only speculate on its specific attributes. I, for one, have technical questions and hunches which linger unanswered or unconfirmed, probably until the API is suitable for public development.
Or, until we just... ask AMD.
Our response came from Guennadi Riguer, the chief architect for Mantle. In it, he discusses the API's usage as a computation language, the future of the rendering pipeline, and whether there will be a day where Crossfire-like benefits can occur by leaving an older Mantle-capable GPU in your system when purchasing a new, also Mantle-supporting one.
Q: Mantle's shading language is said to be compatible with HLSL. How will optimizations made for DirectX, such as tweaks during shader compilation, carry over to Mantle? How much tuning will (and will not) be shared between the two APIs?
[Guennadi] The current Mantle solution relies on the same shader generation path that DirectX games use and includes an open-source component for translating DirectX shaders to Mantle's accepted intermediate language (IL). This enables developers to quickly develop a Mantle code path without any changes to the shaders. This was one of the strongest requests we got from our ISV partners when we were developing Mantle.
Follow-Up: What does this mean, specifically, in terms of driver optimizations? Would AMD, or anyone else who supports Mantle, be able to re-use the effort they spent on tuning their shader compilers (and so forth) for DirectX?
[Guennadi] With the current shader compilation strategy in Mantle, the developers can directly leverage DirectX shader optimization efforts in Mantle. They would use the same front-end HLSL compiler for DX and Mantle, and inside of the DX and Mantle drivers we share the shader compiler that generates the shader code our hardware understands.
A Hard Decision
Welcome to our second annual (only chumps say first annual... crap) Best Hardware of the Year awards. This is where we argue the order of candidates in several categories on the podcast and, some time later, compile the results into an article. The majority of these select the best hardware of its grouping but some look at the more general trends of our industry.
As an aside, Google Monocle will win Best Hardware Ever 2014, 2015, and 2017. It will fail to be the best of all time for 2016, however.
If you would like to see the discussion as it unfolded then you should definitely watch Episode 282, recorded January 2nd, 2014. You do not even need to navigate away, because we left it tantalizingly embedded below this paragraph. You know you want to enrich the next two hours of your life. Click it. Click it a few times if you have click-to-play plugins active in your browser. You can stop clicking when you see the polygons dance. You will know it when you see it.
The categories were arranged as follows:
- Best Graphics Card of 2013
- Best CPU of 2013
- Best Storage of 2013
- Best Case of 2013
- Best Motherboard of 2013
- Best Price Drop of 2013
- Best Mobile Device of 2013
- Best Trend of 2013
- Worst Trend of 2013
Each of the winners will be given our "Editor's Choice" award regardless of the badge it actually received in any review we conducted. This is because the product is the choice of our editors for this year even if it did not earn an "Editor's Choice" at the time; it may not even have been reviewed by us at all.
Also, the criteria for winning each category are left as vague as possible for maximum interpretation.
Introduction and Design
We’re always on the hunt for good docking stations, and sometimes it can be difficult to locate one when you aren’t afforded the luxury of a dedicated docking port. Fortunately, with the advent of USB 3.0 and the greatly improved bandwidth that comes along with it, the options have become considerably more robust.
Today, we’ll take a look at StarTech’s USB3SDOCKHDV, more specifically labeled the Universal USB 3.0 Laptop Docking Station - Dual Video HDMI DVI VGA with Audio and Ethernet (whew). This docking station carries an MSRP of $155 (though it is currently available at resellers such as Amazon for around $125) and sits well above other StarTech options (such as the $100 USBVGADOCK2, which offers just one video output—VGA—10/100 Ethernet, and four USB 2.0 ports).
The big selling points of the USB3SDOCKHDV are its addition of three USB 3.0 ports and Gigabit Ethernet—but most enticingly, its purported ability to provide three total screens simultaneously (including the connected laptop’s LCD) by way of dual HD video output. This video output can be achieved by way of either HDMI + DVI-D or HDMI + VGA combinations (but not by VGA + DVI-D). We’ll be interested to see how well this functionality works, as well as what sort of toll it takes on the CPU of the connected machine.
Continue reading our review of the StarTech USB3SDOCKHDV USB 3.0 Docking Station!!!
A not-so-simple set of instructions
Valve released to the world the first beta of SteamOS, a Linux-based operating system built specifically for PC gaming, on Friday evening. We have spent quite a lot of time discussing and debating the merits of SteamOS, but this weekend we wanted to do an installation of the new OS on a system and see how it all worked.
Our full video tutorial of installing and configuring SteamOS
First up was selecting the hardware for the build. As is usually the case, we had a nearly-complete system sitting around that needed some tweaks. Here is a quick list of the hardware we used, with a discussion about WHY just below.
| Component | SteamOS Build |
| --- | --- |
| Processor | Intel Core i5-4670K - $222 |
| Motherboard | EVGA Z87 Stinger Mini ITX Motherboard - $257 |
| Memory | Corsair Vengeance LP 8GB 1866 MHz (2 x 4GB) - $109 |
| Graphics Card | NVIDIA GeForce GTX TITAN 6GB - $999 or EVGA GeForce GTX 770 2GB SuperClocked - $349 |
| Storage | Samsung 840 EVO Series 250GB SSD - $168 |
| Case | EVGA Hadron Mini ITX Case - $189 |
| Power Supply | Included with case |
| Optical Drive | Slot-loading DVD burner - $36 |
| Peak Compute | 4,494 GFLOPS (TITAN), 3,213 GFLOPS (GTX 770) |
| Total Price | $1947 (GTX TITAN), $1297 (GTX 770) |
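The peak-compute figures can be reproduced from shader count and clock speed: each shader can retire one fused multiply-add (two FLOPs) per clock. The clocks below are the cards' base clocks as we understand them, which is why the TITAN figure lands a few GFLOPS off the table's number:

```python
def peak_gflops(shaders, clock_mhz):
    """Peak single-precision throughput: 2 FLOPs (one FMA) per shader per clock."""
    return shaders * clock_mhz * 2 / 1000

print(peak_gflops(2688, 837))   # GTX TITAN: 2688 shaders at an 837 MHz base clock
print(peak_gflops(1536, 1046))  # GTX 770: 1536 shaders at a 1046 MHz base clock
```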
We definitely weren't targeting a low cost build with this system, but I think we did create a very powerful system to test SteamOS on. First up was the case, the new EVGA Hadron Mini ITX chassis. It's small, which is great for integration into your living room, yet can still hold a full power, full-size graphics card.
The motherboard we used was the EVGA Z87 Stinger Mini ITX - an offering that Morry just recently reviewed and recommended. Supporting the latest Intel Haswell processors, the Stinger includes great overclocking options and a great feature set that won't leave enthusiasts longing for a larger motherboard.
PC Component Selections
It's that time of year again, when those of us lucky enough to have the means get to share the best in technology with our friends and family. You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey... maybe for you. :)
This year we are going to break up the guide into categories. We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets and one for PC accessories. Then, after those specific categories, we'll have an open ended collection of pages where each PC Perspective team member can throw in some wildcards.
Our Amazon code is: pcper04-20
Intel Core i7-4770K Haswell Processor
The Intel Core i7-4770K is likely the best deal in computing performance today, able to power just about any configuration of PC you can think of without breaking much of a sweat. You want to game? This part has you covered. You want to encode some video? The four cores and included HyperThreading support provide just about as much power as you could need. Yes, there are faster processors in the form of the Ivy Bridge-E and even 10+ core Xeon processors, but those are significantly more expensive. For a modest price of $299 you can get what is generally considered the "best" processor on the market.
Corsair Carbide Series Air 540 Case
Cases are generally considered a PC component that is mostly about the preference of the buyer, but there are still fundamentals that make for good, solid cases. The new Corsair Carbide Air 540 is unique in a lot of ways. The square-ish shape allows for a division of your power supply, hard drives and SSDs from the other motherboard-attached components. Even though the case is a bit shorter than others on the market, there is plenty of working room inside thanks to the Corsair dual-chamber setup, and it even includes a pair of high-performance Corsair AF140L fans for intake and exhaust. The side panel window is HUGE, allowing you to show off your goods, and nice touches like the rubber-grommeted cable routing cutouts and dust filters make this one of the best mid-range cases available.
Does downloading make a difference?
I posted a story earlier this week that looked at the performance of the new PS4 when used with three different 2.5-in storage options: the stock 500GB hard drive, a 1TB hybrid SSHD and a 240GB SSD. The results were fairly interesting (and got a good bit of attention) but some readers wanted more data. In particular, many asked how things might change if you went the full digital route and purchased games straight from Sony's PlayStation Network. I will also compare boot times for each of the tested storage devices.
You should definitely check out the previous article if you missed it. It not only goes through the performance comparison but also details how to change the hard drive on the PS4 from the physical procedure to the software steps necessary. The article also details the options we selected for our benchmarking.
- HGST 500GB 5400 RPM HDD - $50 - $0.10/GB
- Seagate 1TB Hybrid SSHD - $122 - $0.12/GB
- Corsair 240GB Force GS SSD - $189 - $0.78/GB
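Those price-per-gigabyte figures come straight from dividing price by capacity (the article's SSD figure is truncated rather than rounded):

```python
drives = [
    ("HGST 500GB 5400 RPM HDD", 50, 500),
    ("Seagate 1TB Hybrid SSHD", 122, 1000),
    ("Corsair 240GB Force GS SSD", 189, 240),
]

for name, price, capacity_gb in drives:
    print(f"{name}: ${price / capacity_gb:.2f}/GB")
```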
Today I purchased a copy of Assassin's Creed IV from the PSN store (you're welcome Ubisoft) and got to testing. The process was the same: start the game then load the first save spot. Again, each test was run three times and the averages were reported. The PS4 was restarted between each run.
The top section of results is the same that was presented earlier - average load times for AC IV when the game is installed from the Blu-ray. The second set is new and includes average load times for AC IV after installation from the PlayStation Network; no disc was in the drive during testing.
Load time improvements
On Friday Sony released the PlayStation 4 onto the world. As Sony's first new console launch in 7 years, the PS4 has a lot to live up to, but our story today isn't going to attempt to weigh the value of the hardware or software ecosystem. Instead, after our PS4 teardown video from last week, we got quite a few requests for information on storage performance with the PS4 and what replacement hardware might offer gamers.
Hard Drive Replacement Process
Changing the hard drive in your PlayStation 4 is quite simple, a continuation of Sony's policy with the PS3.
Installation starts with the one semi-transparent panel on the top of the unit, to the left of the light bar. Obviously make sure your PS4 is completely turned off and unplugged.
Simply slide it to the outside of the chassis and wiggle it up to release. There are no screws or anything to deal with yet.
Once inside you'll find a screw with the PS4 shape logos on it; that is the screw you need to remove to pull out the hard drive cage.
ShadowPlay is NVIDIA's latest addition to their GeForce Experience platform. This feature allows their GPUs, starting with Kepler, to record game footage locally or (in a later update) stream it online through Twitch.tv. It requires Kepler GPUs because it is accelerated by that hardware. The goal is to constantly record game footage without any noticeable impact on performance; that way, the player can keep it running forever and have the opportunity to save moments after they happen.
Also, it is free.
I know that I have several gaming memories which come unannounced and leave undocumented. A solution like this is very exciting to me. Of course, a feature on paper is not the same as functional software in the real world. Thankfully, at least in my limited usage, ShadowPlay mostly lives up to its claims. I do not feel its impact on gaming performance. I am comfortable leaving it on at all times. There are issues, however, that I will get to soon.
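That "save moments after they happen" trick implies a ring buffer: keep only the last few minutes of frames and silently discard everything older. A toy model of the idea (in no way NVIDIA's implementation, which encodes H.264 in hardware) looks like this:

```python
from collections import deque

class ShadowBuffer:
    """Toy ring buffer: keep only the most recent `seconds` of frames."""

    def __init__(self, seconds, fps=60):
        # A deque with maxlen automatically drops the oldest entries.
        self.frames = deque(maxlen=seconds * fps)

    def capture(self, frame):
        self.frames.append(frame)

    def save_clip(self):
        """'Shadow' save: everything still sitting in the buffer."""
        return list(self.frames)

buf = ShadowBuffer(seconds=2, fps=4)  # tiny numbers for illustration
for frame_number in range(20):        # five 'seconds' of gameplay go by
    buf.capture(frame_number)
print(buf.save_clip())                # only the last 8 frames survive
```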
This first impression is based on my main system running the 331.65 (Beta) GeForce drivers recommended for ShadowPlay.
- Intel Core i7-3770, 3.4 GHz
- NVIDIA GeForce GTX 670
- 16 GB DDR3 RAM
- Windows 7 Professional
- 1920 x 1080 @ 120Hz.
- 3 TB USB3.0 HDD (~50MB/s file clone).
The two games tested are Starcraft II: Heart of the Swarm and Battlefield 3.
A new generation of Software Rendering Engines.
We have been busy with side projects, here at PC Perspective, over the last year. Ryan has nearly broken his back rating the frames. Ken, along with running the video equipment and "getting an education", developed a hardware switching device for Wirecast and XSplit.
My project, "Perpetual Motion Engine", has been researching and developing a GPU-accelerated software rendering engine. Now, to be clear, this is in very early development for the moment. The point is not to draw beautiful scenes. Not yet. The point is to show what OpenGL and DirectX do and what limits are removed when you do the math directly.
Errata: BioShock uses a modified Unreal Engine 2.5, not 3.
In the above video:
- I show the problems with graphics APIs such as DirectX and OpenGL.
- I talk about what those APIs attempt to solve, finding color values for your monitor.
- I discuss the advantages of boiling graphics problems down to general mathematics.
- Finally, I prove the advantages of boiling graphics problems down to general mathematics.
I would recommend watching the video, first, before moving forward with the rest of the editorial. A few parts need to be seen for better understanding.
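To give a flavor of what "doing the math directly" means, here is a trivial software renderer: no API, no driver, just a function that computes a value for every pixel (a deliberately minimal sketch, nothing like the actual Perpetual Motion Engine):

```python
WIDTH, HEIGHT = 16, 8

def shade(x, y):
    """A 'pixel shader' as plain math: bright inside a circle, dark outside."""
    cx, cy = WIDTH / 2, HEIGHT / 2
    return "#" if (x - cx) ** 2 + (y - cy) ** 2 < 16 else "."

# The whole 'renderer' is just evaluating that function at every pixel.
image = ["".join(shade(x, y) for x in range(WIDTH)) for y in range(HEIGHT)]
print("\n".join(image))
```

Swap the boolean test for any math you like - lighting, distance fields, ray-sphere intersections - and you have a renderer limited only by arithmetic, which is the point of the editorial.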
If Microsoft was left to their own devices...
Microsoft's Financial Analyst Meeting 2013 set the stage, literally, for Steve Ballmer's last annual keynote to investors. The speech promoted Microsoft, its potential, and its unique position in the industry. He proclaims, firmly, their desire to be a devices and services company.
The explanation, however, does not befit either industry.
Ballmer noted, early in the keynote, how Bing is the only notable competitor to Google Search. He wanted to make it clear, to investors, that Microsoft needs to remain in the search business to challenge Google. The implication is that Microsoft can fill the cracks where Google does not, or even cannot, and establish a business from that foothold. I agree. Proprietary products (which are not inherently bad by the way), as Google Search is, require one or more rivals to fill the overlooked or under-served niches. A legitimate business can be established from that basis.
It is the following, similar, statement which troubles me.
Ballmer later mentioned, along the same vein, how Microsoft is among the few making fundamental operating system investments. Like search, the implication is that operating systems are proprietary products which must compete against one another. This, albeit subtly, does not match their vision as a devices and services company. The point of a proprietary platform is to own the ecosystem, from end to end, and to derive your value from that control. The product is not a device; the product is not a service; the product is a platform. This makes sense to them because, from birth, they were a company which sold platforms.
A platform as a product is not a device nor is it service.
Over the past few weeks, I have been developing a device that enables external control of Wirecast and XSplit. Here's a video of the device in action:
But now, let's get into a little bit of background information:
While the TriCaster from NewTek has made great strides in decreasing the cost of video switching hardware, and can be credited with some of the rapid expansion of live streaming on the Internet, it still requires an initial investment of about $20,000 at the entry level. Even though this is down from around 5x or 10x that cost just a few years ago for professional-grade hardware, it still presents a significant startup cost.
This brings us to my day job. For the past 4 years I have worked here at PC Perspective. My job began as an intern helping to develop video content, but quickly expanded from there. Several years ago, we decided to make the jump to live content, and started investing in the required infrastructure. Since we obviously didn't need to worry about the availability of PC hardware, we decided to go the software video switching route, as opposed to dedicated hardware like the TriCaster. At the time, we started experimenting with Wirecast and bought a few Blackmagic Intensity Pro HDMI capture cards for our Canon Vixia HV30 cameras. Overall, building a 6-core computer (Core i7-980X in those days) with 3 capture cards resulted in an investment of about $2500.
The software route was not only a much cheaper initial investment - we had an operation running for about 1/10th of the cost of a TriCaster - but our setup was ultimately more expandable. If we had gone with a TriCaster we would have had a fixed number of inputs, but in this configuration we could add more inputs on the fly as long as we had available I/O on our computer.
Introduction and externals
Razer maintains a distinct sense of style across their product line. Over the past decade and a half, Razer has carved a spot in the peripherals market catering to competitive gamers as well as developing wholly novel products for the gaming market. Razer has a catalog including standard peripherals and more arcane things such as mice with telephone-style keypads geared toward MMORPG players as well as motion sensing controllers employing magnetic fields to detect controller position.
The Razer BlackWidow Ultimate Stealth 2013 Edition comes out of the box ready for use, with no additional software or assembly required. The keyboard uses a standard layout with five macro keys attached in a column on the left of the board. Rather than dedicated media buttons, media and keyboard-specific functions are accessed by pressing the Fn key (located to the right of the right Alt key) in combination with the function keys on the top row.
The headphone and microphone jacks are present on the side of the keyboard.
NVIDIA Finally Gets Serious with Tegra
Tegra has had an interesting run of things. The original Tegra was utilized only by Microsoft, in the Zune HD. Tegra 2 saw better adoption, but did not produce the design wins to propel NVIDIA to a leadership position in cell phones and tablets. Tegra 3 found a spot in Microsoft’s Surface, but that has turned out to be a far more bitter experience than expected. Tegra 4 so far has been integrated into a handful of products and is being featured in NVIDIA’s upcoming Shield product. It also hit some production snags that made it later to market than expected.
I think the primary issue with the first three generations of products is pretty simple: there was a distinct lack of differentiation from the other ARM-based products around. Yes, NVIDIA brought their graphics prowess to the market, but never in a form that distanced itself adequately from the competition. Tegra 2 boasted GeForce based graphics, but we did not find out until later that it was composed of basically four pixel shaders and four vertex shaders that had more in common with the GeForce 7800/7900 series than with any of the modern unified architectures of the time. Tegra 3 boasted a big graphical boost, but it was in the form of doubling the pixel shader units and leaving the vertex units alone.
While NVIDIA had very strong developer relations and a leg up on the competition in terms of software support, it was never enough to propel Tegra beyond a handful of devices. NVIDIA is trying to rectify that with Tegra 4 and the 72 shader units that it contains (still divided between pixel and vertex units). Tegra 4 is not perfect in that it is late to market and the GPU is not OpenGL ES 3.0 compliant. ARM, Imagination Technologies, and Qualcomm are offering new graphics processing units that are not only OpenGL ES 3.0 compliant, but also offer OpenCL 1.1 support. Tegra 4 does not support OpenCL. In fact, it does not support NVIDIA’s in-house CUDA. Ouch.
Jumping into a new market is not an easy thing, and invariably mistakes will be made. NVIDIA worked hard to make a solid foundation with their products, and certainly they had to learn to walk before they could run. Unfortunately, running effectively entails having design wins due to outstanding features, performance, and power consumption. NVIDIA was really only average in all of those areas. NVIDIA is hoping to change that. Their first salvo into offering a product that offers features and support that is a step above the competition is what we are talking about today.
A quick look at a great accessory
Though we are a PC hardware and technology website by day, we are also video creators by night (and sometimes day as well). If you don't believe me, check out our PC Perspective video tag or even our very own YouTube channel. See?!?
While we do have a big fancy studio setup for in-house production, sometimes on the road you just need something quick and easy, but also high quality, for recording. While our collection of DSLR cameras does an amazing job with video quality, the audio from the in-camera microphones has always sucked, and lugging around wireless mic packs seemed unnecessary much of the time.
Enter the RODE VideoMic.
This $170 shotgun, directional microphone is from one of the most well recognized and respected companies in pro-sumer audio. In the short video below I show you what you get in the box (not much) and how much you can improve your audio with this simple add-on.
Overall, I have to say I was very impressed with the RODE VideoMic and anyone looking to improve the quality of their videos with an audio upgrade should give this option a try!