Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 05:23 PM | Ryan Shrout
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen
Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.
On hand we'll be doing demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here) and also showing off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.
NVIDIA Live Stream with Tom Petersen
9am PT / 12pm ET - August 22nd
The topic list is going to include (but is not limited to):
- ASUS PG278Q G-Sync monitor
- G-Sync availability and pricing
- G-Sync Surround setup, use and requirements
- Technical issues surrounding G-Sync: latency, buffers, etc.
- Comparisons of G-Sync to Adaptive Sync
- SHIELD Tablet game play
But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs and more? Nothing is off limits here, though obviously Tom may be cagey about future announcements. Please use the comments section on this news post below (registration not required) to ask your questions and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away for live viewers as well...
See you tomorrow!!
Subject: Mobile | August 15, 2014 - 02:22 PM | Jeremy Hellstrom
Tagged: Surface Pro 3, microsoft
With a 12" 2160x1440 screen, a 4th generation Core i3, i5 or i7 and a full version of Windows 8.1, the new Surface Pro 3 is the best tablet Microsoft has offered so far. It is thinner than the Pro 2 but has a 1.5" larger screen with better resolution, and a battery that should last about 8 hours while you are working, slightly longer when just browsing. The Surface Pen is a nice addition to the dock and stand we have become familiar with. Overall The Inquirer was fairly impressed with Microsoft's new offering, apart from the pricing, which is rather prohibitive even before accessorizing.
"THE SURFACE PRO 3 tablet brings some of the biggest and most welcome changes seen in the Surface tablet line yet, with a bigger and better 12in HD screen, a much thinner case and an improved keyboard and kickstand, meaning it's never lived up to its motto of 'the tablet that can replace your laptop' more."
Here are some more Mobile articles from around the web:
- Asus C200 Chromebook Review @ TechwareLabs
- ASUS X200MA: 11.6-inch Bay Trail Notebook @ SPCR
- Acer Goes Tegra K1 for Chromebook 13 @ Hardware Canucks
- Arctic Home Charger 4500 USB Adapter @ Funky Kit
- Corsair Voyager Air 2 @ Kitguru
- HUAWEI Ascend Mate2 Smart Phone Review @ Legit Reviews
Subject: General Tech, Graphics Cards, Processors, Mobile, Shows and Expos | August 13, 2014 - 09:55 PM | Scott Michaud
Tagged: siggraph 2014, Siggraph, microsoft, Intel, DirectX 12, directx 11, DirectX
Along with GDC Europe and Gamescom, Siggraph 2014 is going on in Vancouver, BC. There, Intel showed a DirectX 12 demo at their booth. The scene, containing 50,000 asteroids, each in its own draw call, was developed on both Direct3D 11 and Direct3D 12 code paths and could apparently be switched between them while the demo was running. Intel claims to have measured both power and frame rate.
Variable power to hit a desired frame rate, DX11 and DX12.
The test system is a Surface Pro 3 with an Intel HD 4400 GPU. Doing a bit of digging, this would make it the i5-based Surface Pro 3. Removing another shovel-load of mystery, this would be the Intel Core i5-4300U with two cores, four threads, a 1.9 GHz base clock, up to a 2.9 GHz turbo clock, 3MB of cache, and (of course) the Haswell architecture.
While not top-of-the-line, it is also not bottom-of-the-barrel. It is a respectable CPU.
Intel's demo on this processor shows a significant power reduction in the CPU, and even a slight decrease in GPU power, for the same target frame rate. If power was not throttled, Intel's demo goes from 19 FPS all the way up to a playable 33 FPS.
Intel will discuss more during a video interview, tomorrow (Thursday) at 5pm EDT.
Maximum power in DirectX 11 mode.
For my contribution to the story, I would like to address the first comment on the MSDN article. It claims that this is just an "ideal scenario" of a scene that is bottlenecked by draw calls. The thing is: that is the point. Sure, a game developer could optimize the scene to (maybe) instance objects together, and so forth, but that is unnecessary work. Why should programmers, or worse, artists, need to spend so much of their time developing art so that it can be batched together into fewer, bigger commands? Would it not be much easier, and all-around better, if the content could be developed as it most naturally comes together?
That, of course, depends on how much performance improvement we will see from DirectX 12, compared to theoretical maximum efficiency. If pushing two workloads through a DX12 GPU takes about the same time as pushing one double-sized workload, then developers can literally implement whatever solution is most direct.
Maximum power when switching to DirectX 12 mode.
If, on the other hand, pushing two workloads is 1,000x slower than pushing a single, double-sized one, but DirectX 11 was 10,000x slower, then it matters less, because developers will still need their optimization tricks in those situations. The closer the two cases get, the fewer occasions on which strict optimization is necessary.
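The batching tradeoff being described here can be sketched with a toy frame-time model. The per-call costs and GPU time below are made-up illustration numbers, not measurements from Intel's demo; the point is only the shape of the argument: batching helps exactly when the CPU's submission cost is the bottleneck.

```python
# Toy model of a frame: the CPU submits draw calls, the GPU does the work.
# Frame time is bounded by whichever side finishes last. All constants are
# illustrative, not measured values.

def frame_time_ms(draw_calls, per_call_overhead_ms, gpu_work_ms):
    cpu_ms = draw_calls * per_call_overhead_ms  # API submission cost
    return max(cpu_ms, gpu_work_ms)

GPU_WORK_MS = 20.0   # fixed GPU-side cost for the scene
ASTEROIDS = 50_000   # one draw call per asteroid, as in Intel's demo

# Hypothetical per-call overheads, with the DX12-style path ~10x cheaper:
dx11_style = frame_time_ms(ASTEROIDS, 0.001, GPU_WORK_MS)   # CPU-bound: 50 ms
dx12_style = frame_time_ms(ASTEROIDS, 0.0001, GPU_WORK_MS)  # GPU-bound: 20 ms

# Hand-batching everything into one instanced call removes the CPU cost
# entirely, but once the GPU is the bottleneck it buys nothing extra:
batched = frame_time_ms(1, 0.001, GPU_WORK_MS)              # still 20 ms

print(dx11_style, dx12_style, batched)
```

In this sketch, once per-call overhead drops enough that submission is no longer the bottleneck, manual batching and instancing stop paying for themselves, which is the "fewer occasions that strict optimization is necessary" point.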
If there are any DirectX 11 game developers, artists, and producers out there, we would like to hear from you. How much would a (let's say) 90% reduction in draw call latency (which is around what Mantle claims) give you, in terms of fewer required optimizations? Can you afford to solve problems "the naive way" now? Some of the time? Most of the time? Would it still be worth it to do things like object instancing and fewer, larger materials and shaders? How often?
Subject: General Tech, Mobile | August 11, 2014 - 08:00 AM | Tim Verry
Tagged: webgl, tegra k1, nvidia, geforce, Chromebook, Bay Trail, acer
Today Acer unveiled a new Chromebook powered by an NVIDIA Tegra K1 processor. The aptly-named Chromebook 13 is a 13-inch thin-and-light notebook running Google’s Chrome OS, with up to 13 hours of battery life and three times the graphical performance of existing Chromebooks using Intel Bay Trail and Samsung Exynos processors.
The Chromebook 13 is 18mm thick and comes in a white plastic fanless chassis that hosts a 13.3” display, full size keyboard, trackpad, and HD webcam. The Chromebook 13 will be available with a 1366x768 or 1920x1080 resolution panel depending on the particular model (more on that below).
Beyond the usual laptop fixtures, external I/O includes two USB 3.0 ports, HDMI video output, an SD card reader, and a combo headphone/mic jack. Acer has placed one USB port on the left side along with the card reader and one USB port next to the HDMI port on the rear of the laptop. Personally, I welcome the HDMI port placement as it means connecting a second display will not result in a cable invading the mousing area should I wish to use a mouse (and it’s even southpaw-friendly, Scott!).
The Chromebook 13 looks decent from the outside, but it is the internals where the device gets really interesting. Instead of going with an Intel Bay Trail (or even Celeron/Core i3), Acer has opted to team up with NVIDIA to deliver the world’s first NVIDIA-powered Chromebook.
Specifically, the Chromebook 13 uses an NVIDIA Tegra K1 SoC, up to 4GB of RAM, and up to 32GB of flash storage. The K1 offers up four Cortex-A15 CPU cores clocked at 2.1GHz and a graphics unit with 192 Kepler-based CUDA cores. Acer rates the Chromebook 13 at 11 hours with the 1080p panel or 13 hours when equipped with the 1366x768 display. Even being conservative, the Chromebook 13 looks to be the new leader in Chromebook battery life (with the previous leader claiming 11 hours).
A graph comparing WebGL performance between the NVIDIA Tegra K1, Intel (Bay Trail) Celeron N2830, Samsung Exynos 5800, and Samsung Exynos 5250. Results courtesy NVIDIA.
The Tegra K1 is a powerful little chip, and it is nice to see NVIDIA get a design win here. NVIDIA claims that the Tegra K1, which is rated at 326 GFLOPS of compute performance, offers up to three times the graphics performance of the Bay Trail N2830 and Exynos 5800 SoCs. Additionally, the K1 reportedly uses slightly less power and delivers higher multi-tasking performance. I’m looking forward to seeing independent reviews of this laptop form factor, and I hope the chip lives up to its promises.
The Chromebook 13 is currently up for pre-order and will be available in September starting at $279. The Tegra K1-powered laptop will hit the United States and Europe first, with other countries to follow. Initially, the Europe roll-out will include “UK, Netherlands, Belgium, Denmark, Sweden, Finland, Norway, France, Germany, Russia, Italy, Spain, South Africa and Switzerland.”
Acer is offering three consumer SKUs and one education SKU that will be offered exclusively through a reseller. Please see the chart below for the specifications and pricing.
| Acer Chromebook 13 Model | System Memory (RAM) | Storage (flash) | Display | MSRP |
| --- | --- | --- | --- | --- |
| CB5-311-T9B0 | 2GB | 16GB | 1920x1080 | $299.99 |
| CB5-311-T1UU | 4GB | 32GB | 1920x1080 | $379.99 |
| CB5-311-T7NN (base model) | 2GB | 16GB | 1366x768 | $279.99 |
| Educational SKU (reseller only) | 4GB | 16GB | 1366x768 | $329.99 |
Intel made some waves in the Chromebook market earlier this year with the announcement of several new Intel-powered Chrome devices and the addition of conflict-free Haswell Core i3 options. It seems that it is now time for the ARM(ed) response. I’m interested to see how NVIDIA’s newest chip stacks up to the current and upcoming Intel x86 competition in terms of graphics power and battery usage.
As far as Chromebooks go, if the performance is at the point Acer and NVIDIA claim, this one definitely looks like a decent option considering the price. I think a head-to-head between the ASUS C200 (Bay Trail N2830, 2GB RAM, 16GB eMMC, and 1366x768 display at $249.99 MSRP) and the Acer Chromebook 13 would be interesting, as the real differentiator (beyond aesthetics) is the underlying SoC. Considering the big price jump to get 4GB of RAM (mostly a result of the doubled flash) in the $379.99 model, I do wish there were a 4GB/16GB/1080p option in the Chromebook 13 lineup at, say, $320 MSRP.
Read more about Chromebooks at PC Perspective!
Subject: Mobile | August 5, 2014 - 02:24 PM | Jeremy Hellstrom
Tagged: msi, WS60, mobile workstation, Quadro K2100M
Weighing in at under 5 lbs and measuring less than 1" thick, the MSI WS60 is very small yet houses a Quadro K2100M, which is powerful enough for professional design work. With Thunderbolt 2 connectivity it is capable of outputting 4K video, and the Super RAID array will give you very impressive performance even when working with large files.
City of Industry, Calif. – August 5, 2014 – MSI Computer Corp, a leading manufacturer of computer hardware products and solutions, unveils the world’s thinnest and lightest mobile workstation, the WS60. Powered by state-of-the-art technologies, including NVIDIA Quadro K2100M 3D graphics, Intel Core i7 processor and MSI’s Super RAID technology, MSI’s newest workstation weighs only 4.36 lbs., measures less than 0.8-inches thick, and delivers superior performance in an unprecedented sexy design.
“The WS60 is the perfect workstation for mobile designers and CAD CAM Engineers,” says Andy Tung, president of MSI Pan America. “It combines the versatility of an ultrabook with the performance of a workstation and takes it to another level with Thunderbolt connectivity, MSI’s Shortcut Manager, and an array of gaming components.”
MSI’s WS60 offers the fastest processing speed available in an ultra-slim workstation. Featuring the latest 4th generation CPU from Intel and NVIDIA Quadro K2100M professional graphics, the WS60 can blaze through even the most demanding tasks. MSI enhances its prowess by adding Super RAID to seamlessly integrate 2x SSDs and 1x HDD storage for over 1000 MB/s of read speed, dual fan technology for maximum heat dissipation while remaining whisper quiet, and Thunderbolt 2 connectivity. Thunderbolt 2 comes with 4K and 3D video output and dramatically increases transfer speeds up to 20 Gbps, 4 times faster than a USB 3.0 port.
The WS60 sheds the bulky and boxy design of traditional workstations. Inspired by the acclaimed look of MSI’s gaming notebooks, the WS60 has a full metallic body with MG-Li alloy parts that guarantee utmost durability, visual appeal and feather-like weight. Measuring less than 0.8-inches thick and weighing only 4.36 lbs., the WS60 is a featherweight notebook with ultra-heavyweight performance.
Creating digital art requires a screen that accurately displays true-to-life colors and captures every minute detail. MSI has outfitted the WS60 with a vibrant WQHD+ 3K display for those who demand a high level of detail in their work.
Certified for Professionals
All MSI workstation laptops, including the WS60, guarantee optimal performance with professional 3D programs like SolidWorks and more with certification from these software giants.
MSI’s WS60 is packed with professional-grade parts, including a SteelSeries full-color backlight keyboard with Anti-Ghost keys to guarantee superior tactile feedback, Killer Game Networking chip to optimize internet bandwidth, and MSI’s Shortcut Manager. MSI’s Shortcut Manager allows designers and engineers to program keys and combines multiple keys into a single command key, increasing efficiency and speed.
The WS60 is currently available in two different configurations starting at $2,299.99.
Subject: Mobile | July 29, 2014 - 06:22 PM | Jeremy Hellstrom
Tagged: asus, memo pad ME176C, android 4.2.2, Bay Trail
Powered by a Bay Trail Atom Z3745 with 1GB of LPDDR3-1066, 16GB of eMMC storage, support for SD cards up to 64GB and a 7" 1280x800 IPS display, the ASUS MeMO Pad ME176C is rather impressive for under $150. Shipping with Android 4.2.2 or 4.4, the MeMO Pad is not quite as powerful as NVIDIA's new tablet but is nowhere near as expensive either. The Tech Report rather liked this device, as did Ryan; for those on a tight budget the new MeMO does just about everything you need for basic usage at an acceptable level of performance.
"Despite its $149 asking price, Asus' Memo Pad ME176C tablet has a quad-core Bay Trail SoC, a 7" IPS display, and little extras like a Micro SD slot and GPS functionality. We take a quick look at this budget slate to see how well Android runs on x86 hardware--and whether a $149 tablet can deliver a good experience."
Here are some more Mobile articles from around the web:
- EVGA Tegra NOTE 7 & ASUS Transformer Pad TF701T Review @ Neoseeker
- Lenovo Yoga 10 HD+ Android Tablet @ Benchmark Reviews
- Festival tech: Leading the charge @ The Inquirer
- Patriot Fuel+ Mobile Rechargeable Battery Review @ HiTech Legion
- Kingston MobileLite Wireless G2 Review @ Legit Reviews
- Nokia Lumia 520 Smartphone Review @ Hardware Secrets
Subject: General Tech, Mobile | July 24, 2014 - 10:04 PM | Scott Michaud
Tagged: shield tablet, shield, nvidia
Just a small note to continue with our SHIELD Tablet coverage. It turns out that the $299 (16GB) SHIELD Tablet, its cover, and its wireless controller are all available for pre-order on Amazon. The unit will actually be available on July 29th, but we were not aware that pre-orders would be possible until now.
While Ryan wrote a preview for the SHIELD Tablet, he is not giving a final word until he gets it into his lab and is capable of giving a full review. Also, we do not know how many units will be available. Whether you should pre-order, or wait for Ryan's final word, is up to you.
Thanks to our fans for alerting us of this availability in the IRC during TWiCH.
Subject: General Tech, Mobile | July 24, 2014 - 02:32 PM | Jeremy Hellstrom
Tagged: Intel, microsoft, netbook, Bay Trail
According to DigiTimes we may see a resurgence of netbooks, this time powered by Bay Trail, which will make them far more usable than the original generation. There are three postulated tiers: $200-250 for 10.1-15.6" models, plus $250-400 and $400-600 tiers at 11.6-17.3", which makes them larger than the original generation that failed to attract many consumers. They are currently scheduled to ship with Bay Trail-M, with future models likely to carry Braswell, in a mix of transformer-style 2-in-1s with touchscreens and more traditional laptop designs. You can expect a maximum thickness of 25mm and a mix of HDD and SSD storage, and we can only hope that the estimated pricing is more accurate than the pricing on Ultrabooks turned out to be.
"For the US$199-249 notebooks, Intel and Microsoft's specification preferences are 10.1- to 15.6-inch clamshell non-touchscreen models using Intel's Bay Trail-M series processors or upcoming Braswell-based processors, which are set to release in the second quarter of 2015."
Here is some more Tech News from around the web:
- GOG.com Announces Linux Support @ Slashdot
- FCC Reminds ISPs That They Can Be Fined For Lacking Transparency @ Slashdot
- Apple to become largest client for TSMC, say sources @ DigiTimes
- Oracle releases its 'unbreakable' homebrew Oracle Linux 7 @ The Inquirer
- Microsoft: We're making ONE TRUE WINDOWS to rule us all @ The Register
- FRIKKIN' LASERS could REPLACE fibre-optic comms cables @ The Register
Subject: General Tech | July 23, 2014 - 01:40 PM | Tim Verry
Tagged: xiaomi, snapdragon 801, smartphone, mobile, LTE, Android 4.4.2
Yesterday, Xiaomi revealed a powerful smartphone called the Mi4 that looks to give the unlocked OnePlus One a run for its money. The new smartphone is launching first in China with an international version coming in the future.
The Xiaomi Mi4 features a 5" 1080p IPS LCD display, a 13MP rear camera, and an 8MP front camera. A metal band surrounds the outside edges of the phone while a stainless steel frame adds rigidity and protection for the internal hardware. The other bits of the case are plastic, however, likely due to weight and signal reception concerns. There is a removable back cover that is available in several different designs and colors. The Mi4 is slightly bulkier than its predecessor at 0.35-inches thick and 149 grams.
Internally, the Mi4 uses a Qualcomm Snapdragon 801 SoC with four Krait 400 CPU cores clocked at 2.5GHz and an Adreno 330 GPU. Further, the smartphone features 3GB of RAM and either 16GB or 64GB of internal storage. It is powered by a 3,080 mAh battery which should provide ample battery life. Wireless connectivity includes dual band 802.11b/g/n/ac Wi-Fi, Bluetooth 4.0, NFC, 3G, and LTE. The WCDMA version of the smartphone will be available first with a CDMA version coming next month, and a 4G LTE capable device coming in September.
The smartphone runs Android 4.4.2 with a highly customized MIUI5 user interface. An updated version of the UI, called MIUI6, is reportedly coming in August, but it is unclear how soon Mi4 users can expect an upgrade.
The Xiaomi Mi4 will be available on July 29 for 1,999 Yuan ($322 USD) for the 16GB version and 2,499 Yuan ($403) for the 64GB version. Initially, it will be 3G only, but a 4G LTE capable version of the smartphone is coming in September (presumably for the same price). Even further out, an unlocked international version is said to be available for purchase in the future.
In all, the Mi4 looks to be a decent phone with enough design tweaks and hardware oomph to give existing high end smartphones a run for their money. You do sacrifice micro SD card support and stock Android, but if you can live with that and are in the target market (or can wait for an international version) it is worth keeping an eye on!
Subject: General Tech, Mobile | July 20, 2014 - 03:41 AM | Scott Michaud
Tagged: shield tablet, shield 2, shield, nvidia
In Europe, NVIDIA set up a simple, official page. Its title is "THE ULTIMATE IS COMING | NVIDIA". On it are the words, "The Ultimate Is Coming" as well as a countdown to 9 AM EDT on Tuesday, July 22nd. At the same time, North Americans get "Ultimate Quest", a text adventure game which ends on -- surprise -- Tuesday with a giveaway of "something big" for the first players who finish.
Allegedly leaked slide - Not Official!
All images credit: Videocardz (and they have more).
What is it? It is very likely the rumored SHIELD tablet, especially considering Videocardz has convincing slides which are definitely made in NVIDIA's style. What made SHIELD so unique was its controller form factor (and NVIDIA's software support). According to the slides, the controller will now be a wireless accessory to the base tablet. What will come standard is a stylus, the "DirectStylus 2 with 3D Paint". This seems like an odd addition, unless they have already planned use cases.
Allegedly leaked slide - Not Official!
As for the tablet? The slides claim Tegra K1, 2GB of RAM, 1920x1200 display, 5MP (HDR) front and rear cameras, and a MicroSD card slot. Previously, leaks suggested a 640x480 front-facing camera, which did not make sense to me. With the original SHIELD lacking a camera, it seemed very odd to relaunch with a bad one. 5MP, especially if it is a good sensor, is much more reasonable (especially for the front-facing one).
Allegedly leaked slide - Not Official!
The leaks also suggest interesting price points, especially for such a powerful tablet. These details can change at a moment's notice, though, so I won't really acknowledge it (apart from embedding the slide below). They do seem to be targeting the end of the month for North America, or middle August for Europe, which is a very quick launch. Then again, it is ready enough to be a prize for a contest which ends on Tuesday.
Allegedly leaked slide - Not Official!
I expect we will see how much of this, if anything, holds up on Tuesday.
Subject: General Tech, Graphics Cards, Mobile | July 19, 2014 - 03:29 AM | Scott Michaud
Tagged: nvidia, geforce, maxwell, mobile gpu, mobile graphics
Apparently, some hardware sites got their hands on an NVIDIA driver listing with several new product codes. They claim thirteen N16(P/E) chips are listed (although I count twelve (??)). While I do not have much knowledge of NVIDIA's internal product structure, the GeForce GTX 880M, based on Kepler, is apparently listed as N15E.
Things have changed a lot since this presentation.
These new parts will allegedly be based on the second-generation Maxwell architecture. Also, the source believes that these new GPUs will be in the GeForce GTX 800-series, possibly with the MX suffix that was last seen in October 2012 with the GeForce GTX 680MX. Of course, being a long-time PC gamer, the MX suffix does not exactly ring positive with my memory. It used to be the Ti line that you wanted, and the MX line that you could afford. But who am I kidding? None of that is relevant these days. Get off my lawn.
Subject: General Tech, Mobile | July 16, 2014 - 01:30 PM | Jeremy Hellstrom
Tagged: win8 mobile, win 8.1, nokia, lumia, skype, microsoft, cyan
If you own a Nokia Lumia phone, there is a Windows Phone 8.1 update with quite a few interesting features available. US customers will see a Cortana update, while all users will gain the ability to search their phone with Bing, an IE11 update, encrypted S/MIME and improved VPN support. There are quite a few app updates, and users of the 1520 and Icon get Nokia Rich Recording and Dolby Digital Plus 5.1, which improve both audio and video options. Check out Cyan at The Register, but don't stop there, because according to The Inquirer you will no longer need to be a Skype Premium member to make group calls from your mobile device, as it is now a free feature for all.
"Nokia is rolling out Windows Phone 8.1 as an over-the-air update that comes bundled with "Cyan," a special feature package that's exclusive to Lumia devices."
Here is some more Tech News from around the web:
- The Linux Supercomputers We Secretly Fear Will Become Sentient @ Linux.com
- Linksys WRT1900AC Dual Band Smart WiFi Wireless AC Router Review @ Legit Reviews
- Can nothing trip up the runaway cash monster that is Intel? Well... @ The Register
- ASMedia, Asustek executives suspected of insider trading @ DigiTimes
Subject: General Tech, Mobile | July 16, 2014 - 04:11 AM | Scott Michaud
Tagged: google, google play, Android, android l
If you have looked at Google's recent design ideologies, first announced at Google I/O 2014, you will see them revolve around skeuomorphism in its most basic sense. By that, I do not mean that they want to make it look like a folder, a metal slab, or a radio button. Their concept is that objects should look like physical objects which behave with physical accuracy, even though they are just simulations of light.
Image Credit: Android Police (and their source)
Basically, rather than having a panel with a toolbar, buttons, and columns, have a background with a page on it. Interface elements which are affected by that panel are on it, while more global actions are off of it. According to Android Police, who make clear that they do not have leaked builds and readers should not believe anything until/unless it ships, the Google Play Store will be redesigned with this consistent, albeit broad, design metric.
Basically, if you are a navigation bar, pack your desk and get out.
If true, when will these land? Anyone's guess. One speculation is that it will be timed with the release of Android "L" in Autumn. Their expectation, however, is that it will be one of many updates Google will make across their products in a rolling pattern. Either way, I think it looks good... albeit similar to many modern websites.
Subject: General Tech, Processors, Mobile | July 16, 2014 - 03:37 AM | Scott Michaud
Tagged: quarterly results, quarterly earnings, quarterly, Intel, earnings
Another fiscal quarter brings another Intel earnings report. Once again, they are doing well for themselves as a whole but are struggling to gain a foothold in mobile. In three months, they sold 8.7 billion dollars in PC hardware, of which 3.7 billion was profit. Its mobile division, on the other hand, brought in 51 million USD in revenue, losing 1.1 billion dollars for their efforts. In all, the company is profitable -- by about 3.84 billion USD.
One interesting metric which Intel adds to their chart, and I have yet to notice another company listing this information so prominently, is their number of employees, compared between quarters. Last year, Intel employed about 106,000 people, a figure which increased to 106,300 two quarters ago. Between then and this last quarter, that number dropped by 1,400 to 104,900 employees, about 1.3% of their total workforce. There does not seem to be a stated reason for this decline (except for Richard Huddy, who we know went to AMD).
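As a quick sanity check of those headcount figures:

```python
# Quarter-over-quarter headcount math from Intel's earnings chart.
previous, drop = 106_300, 1_400
current = previous - drop
pct_change = drop / previous * 100
print(current, round(pct_change, 1))  # 104900 employees, a ~1.3% reduction
```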
Image Credit: Anandtech
As a final note, Anandtech, when reporting on this story, added a few historical trends near the end. One which caught my attention was the process technology vs. quarter graph, demonstrating their smallest transistor size over the last thirteen-and-a-bit years. We are still slowly approaching 0nm, following an exponential curve as it approaches its asymptote. The width, however, is still fairly regular. It looks like it is getting slightly longer, but not drastically (minus the optical illusion caused by the smaller drops).
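That trend is easy to make concrete. Taking Intel's marketed node names over roughly the period the graph covers, each generation shrinks the feature size by a near-constant ratio of about 0.7x, which is exactly what produces an exponential curve sliding toward the 0nm asymptote:

```python
# Intel process nodes (nm) from 2001's 130nm through 2012's 22nm.
nodes = [130, 90, 65, 45, 32, 22]

# Exponential decay means a roughly constant shrink ratio per generation.
ratios = [later / earlier for earlier, later in zip(nodes, nodes[1:])]
print([round(r, 2) for r in ratios])  # each step is roughly 0.7x the last
```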
Subject: General Tech, Displays, Mobile | July 15, 2014 - 05:39 PM | Jeremy Hellstrom
Tagged: displaylink, club 3d, 4k
Why would you want a USB 3.0 4K display adapter, you might ask? Perhaps you have an ultrabook with limited display outputs that cannot drive 4K resolution, but you somehow managed to get your hands on a 4K display for work or leisure and need the full resolution. Club 3D now has a family of USB adapters for you: the CSV-2302 USB 3.0 to DisplayPort 4K, the CSV-2301 USB 3.0 to DisplayPort 1600p and the CSV-2300D USB 3.0 to DVI-I graphics adapters. This is the first implementation of the DisplayLink DL-5500 chipset, and it does indeed support 10-bit colour if your display can handle it.
The MSRP for this device when it starts to ship in about 2 weeks will be ~$142.
Club 3D officially launches the next generation of USB 3.0 graphics adapters capable of outputting high resolutions to DVI-I (2048x1152), DisplayPort (2560x1600) and the world’s first USB 3.0 to DisplayPort graphics adapter (CSV-2302), which supports 4K or Ultra High Definition resolution at 3840x2160.
The Universal Serial Bus (USB) port of a desktop computer or notebook is multifunctional and can be used to connect a large variety of (storage) devices, keyboards, mice and other peripherals like monitors. Back in 2011, Club 3D introduced its first SenseVision USB graphics adapters. These small external graphics adapters can be used to connect a DVI or HDMI monitor to the USB 2.0 output of a desktop computer or notebook and create a multi-screen setup.
The SenseVision USB adapters proved to be very successful across the globe! Not only with travelers but also in (semi) professional environments where more monitors mean more productivity.
The new Club 3D USB 3.0 Graphics adapters are fully ‘Plug and Display’ certified and the USB 3.0 to 4K Graphics Adapter (CSV-2302) is the very first to use the brand new DisplayLink DL-5500 chipset enabling 4K Ultra High Definition output to DisplayPort enabled 4K monitors at 30Hz. The Club 3D USB 3.0 to 4K Graphics Adapter (CSV-2302) is the first device available worldwide with the revolutionary new DisplayLink SoC implemented.
This graphics adapter uses few of your system's resources, so it won't affect performance, while at the same time ensuring great image quality. It's the ideal solution for anyone wanting to expand desktop space in order to use multiple programs simultaneously.
- 3840x2160 output at 30Hz
- Backwards compatible with QHD and HD monitors
- DP 1.2 interface (DisplayPort)
- HDCP 2.0 for protected video playback
- Integrated DisplayPort Audio
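The 30Hz ceiling in that spec list falls out of simple bandwidth arithmetic. A rough back-of-envelope calculation (ignoring blanking intervals and the compression DisplayLink chips apply) shows why 4K at 60Hz is out of reach over a 5 Gbps USB 3.0 link:

```python
# Raw (uncompressed) bandwidth needed for 24-bit 4K output, ignoring
# blanking intervals; illustrative back-of-envelope numbers only.
WIDTH, HEIGHT, BITS_PER_PIXEL = 3840, 2160, 24

def raw_gbps(refresh_hz):
    return WIDTH * HEIGHT * BITS_PER_PIXEL * refresh_hz / 1e9

print(round(raw_gbps(30), 1))  # ~6.0 Gbps: above USB 3.0's 5 Gbps line rate
print(round(raw_gbps(60), 1))  # ~11.9 Gbps: hopeless without heavy compression
```

So even 30Hz output relies on the DisplayLink chipset's compression to squeeze under USB 3.0's effective throughput.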
Subject: Mobile | July 14, 2014 - 03:46 PM | Jeremy Hellstrom
Tagged: memo pad 7, memopad, asus, Android 4.4.2
The ASUS MeMO Pad 7 has a 7" 1280x800 IPS display, a Bay Trail Atom Z3745 quad-core that can run at up to 1.86GHz, 1GB of RAM and 16GB of internal storage with support for SD cards up to 32GB. All in all, these are the specs you would expect from a $150 tablet; the challenge is to be usable enough not to be returned. Legit Reviews tested this tablet and was impressed by the graphics performance of the new Atom but disappointed by the WiFi speeds, which were significantly slower than those of their preferred tablet, the ~$200 Nexus 7.
"Budget friendly Android tablets are a dime a dozen these days, but they all aren’t created equally and there are some very bad tablets out there. When you get into the sub $150 tablet market you need to be very careful with what tablet you go with as companies start cutting costs by reducing the hardware specifications and that can lead to subpar performance and an overall bad user experience. If you’ve ever purchased an inexpensive tablet thinking that they were all the same, you usually find out in under three minutes that you screwed up and will be running to return it."
Here are some more Mobile articles from around the web:
- Adata PV100 4200mAh USB Battery @ eTeknix
- Kingston Mobilelite Wireless G2 @ Hardware Asylum
- Thermaltake Massive Notebook Coolers (V20, SP, TM) Review @ OCC
- Steelseries Stratus Bluetooth iOS Mobile Gaming Controller @ eTeknix
- Motorola Moto G Smartphone Review @ Hardware Secrets
Subject: General Tech, Processors, Mobile | July 11, 2014 - 04:58 PM | Scott Michaud
Tagged: x86, VIA, isaiah II, Intel, centaur, arm, amd
There might be a third x86-compatible processor manufacturer looking at the mobile market. Intel has been trying to make headway, including directly developing Android for the x86 architecture. The company also has a few design wins, mostly with Windows 8.1-based tablets but also the occasional Android-based model. Google is rumored to be preparing the "Nexus 8" tablet with one of Intel's Moorefield SoCs. AMD, the second-largest x86 processor manufacturer, is aiming its Mullins platform at tablets and two-in-ones, but cannot afford to play snowplow, at least not like Intel.
VIA, through its Centaur Technology division, is expected to announce its own x86-based SoC, too. Called Isaiah II, it is rumored to be a quad-core, 64-bit processor with a maximum clock rate of 2.0 GHz. Its GPU is currently unknown. VIA sold its stake in S3 Graphics to HTC back in 2011, making HTC the majority shareholder of the GPU company. That said, HTC and VIA are very close companies: the chairwoman of HTC is the founder of VIA Technologies, and the current President and CEO of VIA, who has held that position since 1992, is her husband. I expect that the GPU architecture will be provided by S3, or will somehow be based on its technology. I could be wrong; both companies will obviously do what they think is best.
It would make sense, though, especially if it benefits HTC with cheap but effective SoCs for Android and "full" Windows (not Windows RT) devices.
Or this announcement could be larger than it would appear. Three years ago, VIA filed for a patent describing a processor that can read both x86 and ARM machine language and translate either into its own internal microinstructions. The Centaur Isaiah II could reasonably be based on that technology; if so, this processor would be able to support either version of Android. Then again, now that Intel has built up the Android x86 code base, VIA may have shelved that initiative (or simply filed the patent for legal reasons).
But what about Intel? Honestly, I see this being a benefit for the behemoth. Extra x86-based vendors will probably grow the overall x86 market share relative to ARM by helping with software support. Even if the chip is compatible with both ARM and x86, what Intel needs right now is software, and Intel can only write so much of it themselves. It is possible that VIA, maker of the original netbook processor, could disrupt the PC market with both x86 and ARM compatibility, but I doubt it.
Centaur Technology, the relevant division of VIA, will make their announcement in less than 51 days.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | July 7, 2014 - 04:06 AM | Scott Michaud
Tagged: tegra k1, OpenGL ES, opengl, Khronos, google io, google, android extension pack, Android
Sure, this is a little late. Honestly, when I first heard the announcement, I did not see much news in it. The slide from the keynote (below) showed four points: Tessellation, Geometry Shaders, Computer [sic] Shaders, and ASTC Texture Compression. I thought tessellation and geometry shaders were part of the OpenGL ES 3.1 spec, like compute shaders. This led to my immediate reaction: "Oh cool. They implemented OpenGL ES 3.1. Nice. Not worth a news post."
Image Credit: Blogogist
Apparently, they were not part of the ES 3.1 spec (although compute shaders are). My mistake. It turns out that Google is cooking up their own vendor-specific extensions. This is quite interesting, as it adds functionality to the API without the developer needing to target a specific GPU vendor (INTEL, NV, ATI, AMD), wait for approval from the Architecture Review Board (ARB), or use multi-vendor extensions (EXT). In other words, it sounds like developers can target Google's vendor extension without knowing the actual hardware.
Hiding the GPU vendor from the developer is not the only reason for Google to host their own vendor extension. The added features are mostly from full OpenGL. This makes sense, because it was announced with NVIDIA and their Tegra K1, Kepler-based SoC. Full OpenGL compatibility was NVIDIA's selling point for the K1, due to its heritage as a desktop GPU. But, instead of requiring apps to be programmed with full OpenGL in mind, Google's extension pushes it to OpenGL ES 3.1. If the developer wants to dip their toe into OpenGL, then they could add a few Android Extension Pack features to their existing ES engine.
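In practice, an app can detect the whole pack at runtime with a single check rather than probing individual vendor extensions. A minimal sketch, assuming the aggregate extension name GL_ANDROID_extension_pack_es31a; the helper class and its method names are my own, hypothetical, and the string would come from glGetString(GL_EXTENSIONS) on a real device:

```java
// Hypothetical helper: check a space-separated OpenGL ES extensions string
// (as returned by glGetString(GL_EXTENSIONS)) for the Android Extension Pack.
// Exact token matching is used because naive substring checks can false-match
// an extension whose name is a prefix of another.
public class GlExtensions {
    public static boolean hasExtension(String extensions, String name) {
        if (extensions == null) {
            return false;
        }
        for (String token : extensions.trim().split("\\s+")) {
            if (token.equals(name)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Sample string for illustration; a real one comes from the GL driver.
        String ext = "GL_EXT_texture_filter_anisotropic GL_ANDROID_extension_pack_es31a";
        System.out.println(hasExtension(ext, "GL_ANDROID_extension_pack_es31a")); // true
    }
}
```

If the check succeeds, the engine can enable its tessellation or geometry shader paths without ever caring whether the GPU underneath is Kepler, Adreno, or Mali.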
Epic Games' Unreal Engine 4 "Rivalry" Demo from Google I/O 2014.
The last feature, ASTC Texture Compression, was an interesting one. Apparently the Khronos Group, owners of OpenGL, were looking for a new generation of texture compression technologies. NVIDIA suggested their ZIL technology. ARM and AMD also proposed "Adaptive Scalable Texture Compression". ARM and AMD won, although the Khronos Group stated that the collaboration between ARM and NVIDIA made both proposals better than either in isolation.
Android Extension Pack is set to launch with "Android L". The next release of Android is not currently associated with a snack food. If I were their marketer, I would block out the next three versions as 5.x and name them (L)emon, then (M)eringue, and finally (P)ie.
Would I do anything with the two skipped letters before pie? (N)(O).
Subject: Mobile | July 3, 2014 - 02:54 PM | Jeremy Hellstrom
Tagged: kingston, MobileLite Wireless G2
The Kingston MobileLite Wireless G2 is hard to describe quickly: you can plug memory cards or USB flash drives into it and access them from a wireless device, plug in an Ethernet cord to use it as a wireless router, or plug in USB devices to recharge them. Often these all-in-one devices end up doing several things poorly rather than one thing very well, but in this case it seems Kingston has pulled it off. Techgage was not terribly impressed with the features of the software, but the utilitarian nature of the interface does keep things simple.
"There are mobile media readers, and then there’s Kingston’s MobileLite Wireless G2. When not serving files over Wi-Fi, it can accept a wired LAN connection to become a travel router, and it can also use its huge battery to help charge your mobile phone while you’re on-the-go. Who doesn’t love a device that can act as a jack-of-all-trades?"
Here are some more Mobile articles from around the web:
- MSI GP70-2PE ‘Leopard’ Gaming Notebook @ eTeknix
- MSI GS60 2PE Ghost Pro @ Kitguru
- MSI GS60-2PE ‘Ghost Pro’ Gaming Notebook @ eTeknix
- PC Specialist Cosmos II @ eTeknix
- MSI GT70-2PE ‘Dominator Pro’ Gaming Notebook @ eTeknix
- Sony Xperia T2 Ultra Dual Smartphone Review @ Hardware Secrets
- Samsung Galaxy Tab Pro 10.1 @ The Inquirer
- LG G3 @ The Inquirer
Subject: Mobile | July 2, 2014 - 12:00 PM | Ryan Shrout
Tagged: linux, linaro, juno, google, armv8-a, ARMv8, arm, android l
Even though Apple has been shipping a 64-bit capable SoC since the release of the A7 part in September of 2013, the Android market has yet to see its first consumer 64-bit SoC release. That is about to change as we progress through the rest of 2014, and ARM is making sure that major software developers have the tools they need to be ready for the architecture shift. That help will come in the form of the Juno ARM Development Platform (ADP) and a 64-bit-ready software stack.
Apple's A7 is the first core to implement ARMv8, but companies like Qualcomm, NVIDIA and of course ARM have their own cores based on the 64-bit architecture. Much like we saw with the 64-bit transition in the x86 ecosystem, ARMv8 will improve access to large datasets and bring performance gains thanks to increased register sizes, virtual address spaces larger than 4GB and more. ARM also improved NEON (SIMD) performance and cryptography support while they were in there fixing up the house.
The Juno platform is the first 64-bit development platform to come directly from ARM and combines a host of components to create a reference hardware design for integrators and developers to target moving forward. Featuring a test chip built around Cortex-A57 (dual core), Cortex-A53 (quad core) and Mali-T624 (quad core), Juno allows software to target 64-bit development immediately without waiting for other SoC vendors to have product silicon ready. The hardware configuration implements big.LITTLE, OpenGL ES3.0 support, thermal and power management, Secure OS capability and more. In theory, ARM has built a platform that will be very similar to SoCs built by its partners in the coming months.
ARM isn't quite talking about the specific availability of the Juno platform, but for the target audience ARM should be able to provide the number of development platforms necessary. Juno enables software development for 64-bit kernels, drivers, tools and virtual machine hypervisors, but it's not necessarily going to help developers writing generic applications. Think of Juno as the development platform for low-level designers and coders, not those migrating Facebook or Flappy Bird to your next smartphone.
The Juno platform helps ARM in a couple of specific ways. From a software perspective, it creates a common foundation for the ARMv8 ecosystem and gives developers access to silicon before ARM's partners have prepared their own platforms. ARM claims that Juno is a fairly "neutral" platform, so software developers won't feel like they are being funneled in one direction. I'd be curious what ARM's partners actually think about that, though, given the inclusion of Mali graphics, a product ARM is definitely trying to promote in a competitive market.
Though the primary focus might be software, hardware partners will be able to benefit from Juno as well. On this board they will find the entire ARMv8 IP portfolio validated in modern silicon, letting hardware vendors see A57 and A53 in action, with the added benefit of a full big.LITTLE implementation. The hope is that this will dramatically accelerate time to market for future 64-bit ARM designs.
The diagram above shows the full breakdown of the Juno SoC as well as some of the external connectivity on the board itself. The memory system is built around 8GB of DDR3 running at 12.8 GB/s, and the platform is extensible through the PCI Express slots and the FPGA options.
Of course, hardware is only half the story - today Linaro is releasing a 64-bit port of the Android Open Source Project (AOSP) that will run on Juno. That, along with the Linux kernel v3.14 with ARMv8-A support, should give developers the tools needed to write the applications, middleware and kernels for future hardware. Also worth noting: at Google I/O on June 25th, Google announced that developer access to Android L is coming, and that build will support ARMv8-A as well.
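For app developers, Android L also exposes a device's supported ABIs in preference order through android.os.Build.SUPPORTED_ABIS, where the ARMv8-A 64-bit ABI is named "arm64-v8a". A standalone sketch of scanning such a list; the helper class and the sample array are my own, hypothetical, stand-ins for what the platform API would return on a real device:

```java
// Hypothetical helper: detect the ARMv8-A 64-bit ABI in a list of supported
// ABIs. On an Android L device the list would come from
// android.os.Build.SUPPORTED_ABIS (ordered by preference).
public class AbiCheck {
    public static boolean supports64BitArm(String[] supportedAbis) {
        for (String abi : supportedAbis) {
            if ("arm64-v8a".equals(abi)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Sample data for illustration; a 64-bit device typically lists its
        // 64-bit ABI first, followed by the 32-bit ABIs it can also run.
        String[] sample = { "arm64-v8a", "armeabi-v7a", "armeabi" };
        System.out.println(supports64BitArm(sample)); // true
    }
}
```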
The switch to 64-bit technology on ARM devices isn't going to happen overnight, but ARM and its partners have put together a collective ecosystem that will allow software and hardware developers to make the transition as quickly and, most importantly, as painlessly as possible. With outside pressure pushing on ARM and its low-power processor designs, the company is taking more of its fate into its own hands, pushing the 64-bit transition forward at an accelerated pace. This helps ARM in the mobile and consumer spaces as well as the enterprise market, a key area for SoC growth.