Subject: General Tech, Processors, Mobile | July 11, 2014 - 04:58 PM | Scott Michaud
Tagged: x86, VIA, isaiah II, Intel, centaur, arm, amd
There might be a third x86-compatible processor manufacturer looking at the mobile market. Intel has been trying to make headway, including directly developing Android for the x86 architecture. The company also has a few design wins, mostly with Windows 8.1-based tablets but also the occasional Android-based model. Google is rumored to be preparing the "Nexus 8" tablet with one of Intel's Moorefield SoCs. AMD, the second-largest x86 processor manufacturer, is aiming their Mullins platform at tablets and two-in-ones, but cannot afford to play snowplow, at least not like Intel.
VIA, through their Centaur Technology division, is expected to announce their own x86-based SoC, too. Called Isaiah II, it is rumored to be a quad-core, 64-bit processor with a maximum clock rate of 2.0 GHz. Its GPU is currently unknown. VIA sold their stake in S3 Graphics to HTC back in 2011, making HTC the majority shareholder of the GPU company. That said, HTC and VIA are very close companies. The chairwoman of HTC is the founder of VIA Technologies, and the current President and CEO of VIA, who has held that position since 1992, is her husband. I expect that the GPU architecture will be provided by S3, or will somehow be based on their technology. I could be wrong. Both companies will obviously do what they think is best.
It would make sense, though, especially if it benefits HTC with cheap but effective SoCs for Android and "full" Windows (not Windows RT) devices.
Or this announcement could be larger than it would appear. Three years ago, VIA filed for a patent describing a processor that can read both x86 and ARM machine language and translate either into its own, internal microinstructions. The Centaur Isaiah II could reasonably be based on that technology. If so, this processor would be able to support either version of Android. Then again, now that Intel has built up the Android x86 code base, maybe VIA shelved that initiative (or just got the patent for legal reasons).
But what about Intel? Honestly, I see this being a benefit for the behemoth. Extra x86-based vendors will probably grow the overall market share, compared to ARM, by helping with software support. Even if the chip is compatible with both ARM and x86, what Intel needs right now is software, and they can only write so much of it themselves. It is possible that VIA, maker of the original netbook processor, could disrupt the PC market with both x86 and ARM compatibility, but I doubt it.
Centaur Technology, the relevant division of VIA, will make their announcement in less than 51 days.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | July 7, 2014 - 04:06 AM | Scott Michaud
Tagged: tegra k1, OpenGL ES, opengl, Khronos, google io, google, android extension pack, Android
Sure, this is a little late. Honestly, when I first heard the announcement, I did not see much news in it. The slide from the keynote (below) showed four points: Tessellation, Geometry Shaders, Computer [sic] Shaders, and ASTC Texture Compression. I thought tessellation and geometry shaders were part of the OpenGL ES 3.1 spec, like compute shaders. This led to my immediate reaction: "Oh cool. They implemented OpenGL ES 3.1. Nice. Not worth a news post."
Image Credit: Blogogist
Apparently, they were not part of the ES 3.1 spec (although compute shaders are). My mistake. It turns out that Google is cooking up their own vendor-specific extensions. This is quite interesting, as it adds functionality to the API without the developer needing to target a specific GPU vendor (INTEL, NV, ATI, AMD), wait for approval from the Architecture Review Board (ARB), or use multi-vendor extensions (EXT). In other words, it sounds like developers can target Google's vendor extension without knowing the actual hardware.
Hiding the GPU vendor from the developer is not the only reason for Google to host their own vendor extension. The added features are mostly from full OpenGL. This makes sense, because the pack was announced alongside NVIDIA and their Tegra K1, Kepler-based SoC. Full OpenGL compatibility was NVIDIA's selling point for the K1, due to its heritage as a desktop GPU. But, instead of requiring apps to be programmed with full OpenGL in mind, Google's extension brings that functionality to OpenGL ES 3.1. If a developer wants to dip their toe into full OpenGL features, they can add a few Android Extension Pack features to their existing ES engine.
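If the Android Extension Pack ships as a single umbrella extension, as Google's announcement suggests, an engine could gate its render path on one string check rather than probing per-vendor extensions. Here is a minimal sketch in Python standing in for the C-side glGetString(GL_EXTENSIONS) logic; the extension name GL_ANDROID_extension_pack_es31a matches Google's announcement, but the helper and the fallback path names are hypothetical:

```python
# Sketch: choosing a render path from the space-separated extension string
# a GL driver reports. The AEP name is from Google's announcement; the
# fallback tiers below are illustrative, not a real engine's.
AEP = "GL_ANDROID_extension_pack_es31a"

def pick_render_path(extension_string: str) -> str:
    """Return a render-path name based on reported GL extensions."""
    exts = set(extension_string.split())
    if AEP in exts:
        return "aep"            # tessellation, geometry shaders, ASTC, etc.
    if "GL_KHR_texture_compression_astc_ldr" in exts:
        return "es31-astc"      # plain ES 3.1 plus ASTC textures
    return "es31"               # baseline ES 3.1 feature set

print(pick_render_path("GL_KHR_debug GL_ANDROID_extension_pack_es31a"))  # aep
```

The point is that one vendor-neutral check replaces a matrix of NV/INTEL/EXT probes.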
Epic Games' Unreal Engine 4 "Rivalry" Demo from Google I/O 2014.
The last feature, ASTC Texture Compression, is an interesting one. Apparently the Khronos Group, stewards of OpenGL, were looking for a new generation of texture compression technology. NVIDIA suggested their ZIL technology, while ARM and AMD proposed "Adaptive Scalable Texture Compression". ARM and AMD won, although the Khronos Group stated that the collaboration between ARM and NVIDIA made both proposals better than either would have been in isolation.
Android Extension Pack is set to launch with "Android L". The next release of Android is not currently associated with a snack food. If I were their marketer, I would block out the next three versions as 5.x and name them (L)emon, then (M)eringue, and finally (P)ie.
Would I do anything with the two skipped letters before pie? (N)(O).
Subject: Mobile | July 3, 2014 - 02:54 PM | Jeremy Hellstrom
Tagged: kingston, MobileLite Wireless G2
The Kingston MobileLite Wireless G2 is hard to describe quickly. You can plug memory cards or USB flash drives into it and access them from a wireless device, you can plug in an Ethernet cord and use it as a wireless router, and you can plug USB devices into it to recharge them. Often these all-in-one devices tend to do several things poorly as opposed to one thing very well, but in this case it seems Kingston has pulled it off. Techgage was not terribly impressed with the features of the software, but the utilitarian nature of the interface does keep things simple.
"There are mobile media readers, and then there’s Kingston’s MobileLite Wireless G2. When not serving files over Wi-Fi, it can accept a wired LAN connection to become a travel router, and it can also use its huge battery to help charge your mobile phone while you’re on-the-go. Who doesn’t love a device that can act as a jack-of-all-trades?"
Here are some more Mobile articles from around the web:
- MSI GP70-2PE ‘Leopard’ Gaming Notebook @ eTeknix
- MSI GS60 2PE Ghost Pro @ Kitguru
- MSI GS60-2PE ‘Ghost Pro’ Gaming Notebook @ eTeknix
- PC Specialist Cosmos II @ eTeknix
- MSI GT70-2PE ‘Dominator Pro’ Gaming Notebook @ eTeknix
- Sony Xperia T2 Ultra Dual Smartphone Review @ Hardware Secrets
- Samsung Galaxy Tab Pro 10.1 @ The Inquirer
- LG G3 @ The Inquirer
Subject: Mobile | July 2, 2014 - 12:00 PM | Ryan Shrout
Tagged: linux, linaro, juno, google, armv8-a, ARMv8, arm, android l
Even though Apple has been shipping a 64-bit capable SoC since the release of the A7 part in September of 2013, the Android market has yet to see its first consumer 64-bit SoC release. That is about to change as we progress through the rest of 2014, and ARM is making sure that major software developers have the tools they need to be ready for the architecture shift. That help will come in the form of the Juno ARM Development Platform (ADP) and a 64-bit-ready software stack.
Apple's A7 is the first core to implement ARMv8, but companies like Qualcomm, NVIDIA and of course ARM have their own cores based on the 64-bit architecture. Much like we saw with the 64-bit transition in the x86 ecosystem, ARMv8 will improve access to large datasets and will result in gains in performance thanks to increased register sizes, virtual address spaces larger than 4GB and more. ARM also improved performance of NEON (SIMD) and cryptography support while they were in there fixing up the house.
The Juno platform is the first 64-bit development platform to come directly from ARM and combines a host of components to create a reference hardware design for integrators and developers to target moving forward. Featuring a test chip built around Cortex-A57 (dual core), Cortex-A53 (quad core) and Mali-T624 (quad core), Juno allows software to target 64-bit development immediately without waiting for other SoC vendors to have product silicon ready. The hardware configuration implements big.LITTLE, OpenGL ES3.0 support, thermal and power management, Secure OS capability and more. In theory, ARM has built a platform that will be very similar to SoCs built by its partners in the coming months.
ARM isn't quite talking about the specific availability of the Juno platform, but for the target audience ARM should be able to provide the number of development platforms necessary. Juno enables software development for 64-bit kernels, drivers, tools and virtual machine hypervisors, but it's not necessarily going to help developers writing generic applications. Think of Juno as the development platform for the low-level designers and coders, not those that are migrating Facebook or Flappy Bird to your next smartphone.
The Juno platform helps ARM in a couple of specific ways. From a software perspective, it creates a common foundation for the ARMv8 ecosystem and allows developer access to silicon before ARM's partners have prepared their own platforms. ARM claims that Juno is a fairly "neutral" platform, so software developers won't feel like they are being funneled in one direction. I'd be curious what ARM's partners actually think about that, though, given the inclusion of Mali graphics, a product that ARM is definitely trying to promote in a competitive market.
Though the primary focus might be software, hardware partners will be able to benefit from Juno as well. On this board they will find the entire ARMv8 IP portfolio tested in modern silicon. This should enable hardware vendors to see the A57 and A53 working in action, with the added benefit of a full big.LITTLE implementation. The hope is that this will dramatically accelerate the time to market for future 64-bit ARM designs.
The diagram above shows the full breakdown of the Juno SoC as well as some of the external connectivity on the board itself. The memory system is built around 8GB of DDR3 running at 12.8 GB/s and is extensible through the PCI Express slots and the FPGA options.
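As a sanity check on that figure, 12.8 GB/s is exactly the peak rate of a single 64-bit channel of DDR3-1600; the channel configuration here is my assumption, not something ARM has specified:

```python
# Back-of-the-envelope check on Juno's quoted 12.8 GB/s memory bandwidth.
# Assumes one 64-bit channel of DDR3-1600 (1.6 GT/s); ARM only quotes the
# aggregate number.
transfers_per_sec = 1600e6   # DDR3-1600 data rate
bus_width_bytes = 8          # one 64-bit channel
bandwidth_gb_s = transfers_per_sec * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 12.8 GB/s
```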
Of course, hardware is only half the story. Today Linaro is releasing a 64-bit port of the Android Open Source Project (AOSP) that will run on Juno. That, along with the Linux kernel v3.14 with ARMv8-A support, should give developers the tools needed to write the applications, middleware and kernels for future hardware. Also worth noting: on June 25th at Google I/O, developer access to Android L was announced, and that build will support ARMv8-A as well.
The switch to 64-bit technology on ARM devices isn't going to happen overnight, but ARM and its partners have put together a collective ecosystem that will allow software and hardware developers to make the transition as quickly and, most importantly, as painlessly as possible. With outside pressure pushing on ARM and its low-power processor designs, it is taking more of its fate into its own hands, pushing the 64-bit transition forward at an accelerated pace. This helps ARM in the mobile space and the consumer space, as well as in the enterprise market, a key segment for SoC growth.
Subject: Mobile | June 29, 2014 - 06:39 PM | Jeremy Hellstrom
Tagged: Republic of Gamers, ips display, i7-4710HQ, gtx 850m, G550JK, asus, 15.6 inch
Fremont, CA (June 26, 2014) - ASUS Republic of Gamers (ROG) today announces the G550JK gaming notebook, a compact powerhouse with a crisp 15.6-inch display that offers all the benefits associated with the award-winning ROG notebook range in an even more portable form factor. Powered by the latest 4th-generation Intel Core i7 processors, and featuring an NVIDIA GeForce GTX 850M GPU that can be overclocked thanks to ASUS TurboMaster technology, the G550JK provides the ultimate gaming-on-the-go experience. In a collaborative effort, the G550JK was used to scale Mt. Elbert, the highest peak of the Rocky Mountains, in an attempt to set the world record for highest elevation LAN party.
Powerful, stylish, and feature-heavy
The G550JK may be compact, slim and portable, but it packs a big punch. ASUS TurboMaster technology, along with dual fans and copper heatsinks, allows the NVIDIA GeForce GTX 850M GPU to be safely overclocked by up to 5%. Meanwhile, Optimus support maximizes battery life when not running GPU-demanding applications.
The sleek low-profile aluminum lines of the G550JK are enhanced by the signature ROG color scheme of black with fine red diamond-cut detailing. The red backlight of the seamless one-piece chiclet keyboard makes it easy on the eyes when gaming in darkened environments, while the subtly illuminated ROG logo on the lid adds a touch of exclusivity and makes sure opponents know exactly what they’re up against. Measuring just 1.1 inches at the thickest point, the G550JK can go anywhere, and win everywhere.
The G550JK packs a 15.6-inch Full HD IPS LED-backlit display that provides a stunning visual experience with its wide 178-degree viewing angles and anti-glare coating for comfort during long gaming sessions. The included 802.11ac WiFi ensures ultra-fast ping times and transfer rates to deliver the best wireless gaming when paired with an 802.11ac router. ASUS SonicMaster Premium, incorporating Bang & Olufsen ICEpower technology and an external SonicMaster subwoofer, gives the G550JK powerful high-fidelity audio and added bass for a more immersive gaming experience.
The G550JK on a quest for a world record
HighLANder, an event hosted by Linus Media Group and Tek Syndicate on June 23rd, 2014 in Leadville, Colorado, used the ROG G550JK gaming notebook to scale Mt. Elbert, the highest peak of the Rocky Mountains at 14,440 ft., followed by wirelessly connecting via an ASUS RT-AC68U router at the summit in order to set a world record for the highest elevation LAN party. The elevation was certified by expert witness Elizabeth Thompson, a PhD student in Atmospheric Meteorology, and has been submitted to the Guinness Book of World Records for review and certification. Ten G550JK notebooks and 13 participants from LinusTechTips, Tek Syndicate, Newegg TV, and ASUS were used to accomplish the feat.
Subject: Processors, Mobile | June 23, 2014 - 01:08 PM | Ryan Shrout
Tagged: snapdragon, qualcomm, gaming, Android, adreno
Today Qualcomm published a 22-page white paper that keys in on the company's focus on Android gaming and the benefits that Qualcomm SoCs offer. As the dominant SoC vendor in the Android ecosystem of smartphones, tablets and handhelds (with more than a 32% share of shipments in Q2 2013), QC is able to offer a unique combination of solutions to both developers and gamers that pushes Android gaming into higher fidelity with more robust gameplay.
According to the white paper, Android gaming is the fastest-growing segment of the gaming market, with a 30% compound annual growth rate from 2013 to 2015, as projected by Gartner. Experiences for mobile games have drastically improved since Android was released in 2008, with developers like Epic Games and its Unreal Engine pushing visuals to near-console and near-PC quality.
Qualcomm is taking a heterogeneous approach to address the requirements of gaming, which include AI execution, physics simulation, animation, low-latency input and high-speed network connectivity in addition to high-quality graphics and 3D rendering. Though not directly part of the HSA standards still in development, the many specialized engines that Qualcomm has developed for its Snapdragon SoCs, including traditional CPUs, GPUs, DSPs, security and connectivity blocks, allow the company to create a solution built for Android gaming dominance.
In the white paper Qualcomm dives into the advantages that the Krait CPU architecture offers for CPU-based tasks as well as the power of the Adreno 4x series of GPUs that offer both raw performance and the flexibility to support current and future gaming APIs. All of this is done with single-digit wattage draw and a passive, fanless design and points to the huge undertaking that mobile gaming requires from an engineering and implementation perspective.
For developers, the ability to target Snapdragon architectures with a single code path that can address a scalable product stack allows for the least amount of development time and the most return on investment possible. Qualcomm continues to support the development community with tools and assistance to bring out the peak performance of Krait and Adreno to get games running on lower power parts as well as the latest and upcoming generations of SoCs in flagship devices.
It is great to see Qualcomm focus on this aspect of the mobile market, and the challenges it presents require strong dedication from these engineering teams. Being able to create compelling gaming experiences with high-quality imagery while maintaining the required power envelope is a task that many other companies have struggled with.
Check out the new landing page over at Qualcomm if you are interested in more technical information as well as direct access to the white paper detailing the work Qualcomm is putting into its Snapdragon line of SoCs for gamers.
Subject: General Tech, Mobile | June 18, 2014 - 01:57 PM | Jeremy Hellstrom
Tagged: Transformer, tablet, laptop, Chromebook, apple
If you are overwhelmed by the choice of mobile products on the market and are looking for a little guidance this article at The Tech Report is a good resource. Their staff have picked out what they feel are the best mobile devices from tablets to transformer pads to full sized laptops. You can choose between several models in each category depending on your budget, as the best solutions tend to be the most expensive. The budget models are nothing to sneer at though as even on the low end mobile devices pack a lot more power than they used to.
"Earlier this year, we revised the structure of the TR System Guide to focus exclusively on PC components. Our aim was to cover peripherals and mobile gear in separate articles. We posted our first standalone peripheral picks in April, and today, we're completing the set with our first standalone mobile staff picks."
Here is some more Tech News from around the web:
- HP Machine: Memristor pioneer explains his discovery @ The Inquirer
- One in five SMBs refuse to let go of Windows XP @ The Inquirer
- Blackberry 10 to finally get Netflix app thanks to Amazon Appstore deal @ The Inquirer
- How to Control a Servo Motor from a BeagleBone Black on Linux @ Linux.com
- Unisys cozies closer to Intel, 'sunsets' proprietary processor @ The Register
- People will happily run malware if paid ONE CENT – new study @ The Register
Subject: General Tech, Storage, Mobile | June 16, 2014 - 01:54 AM | Scott Michaud
CFast is a standard, based on merging CompactFlash with SATA, that gives memory cards SSD-like performance. It has been around for a while (CFast 2.0 was released in Q4 2012) but with very limited adoption; you could count the number of camera models which use it on a single hand. Still, ADATA is entering that market with a lineup of memory cards with quite a bit of variety.
The ADATA ISC3E will come in SLC (one stored bit per memory cell) and MLC (two stored bits per memory cell) models. Capacities will range from 4GB to 64GB (SLC) or 4GB to 128GB (MLC). Speeds are fairly low, compared to modern SSDs. SLC is rated at 165 MB/s read and 170 MB/s write, while MLC can read at 435 MB/s and write at 120 MB/s. They support ECC and S.M.A.R.T.
Of course, this is kind-of interesting in terms of its small, removable form factor. Beyond that, it seems to be a few years behind in terms of SSD technology. For the high-resolution (or high frame rate) camera use case, read and write speeds really do not matter, except when you transfer your media off of your device (which the MLC version is clearly better suited for). Otherwise, as long as your write speed is consistently above what the camera can output, going faster is wasted overhead. ADATA suggests using these CFast 2.0 cards in POS terminals and kiosks but, at that point, would you really need small and removable memory?
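For a rough sense of what those write ratings mean in practice, here is how long a sustained sequential write would take to fill each top-capacity card; these are computed from the vendor ratings above, not measured numbers:

```python
# Time to fill a card at its rated sequential write speed (vendor figures,
# decimal GB, so this is an optimistic lower bound).
def fill_time_minutes(capacity_gb: float, write_mb_s: float) -> float:
    return capacity_gb * 1000 / write_mb_s / 60

print(f"64GB SLC @ 170 MB/s: {fill_time_minutes(64, 170):.1f} min")
print(f"128GB MLC @ 120 MB/s: {fill_time_minutes(128, 120):.1f} min")
```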
ADATA has not released pricing and availability.
Subject: General Tech, Mobile, Shows and Expos | June 15, 2014 - 01:51 AM | Scott Michaud
Tagged: x86, SteamOS, Steam Machine, Steam Controller, steam, mobile, handheld, E3 14, E3
To be doubly clear, if the title was not explicit enough, this announcement is not from Valve. The company is called the "SteamBoy Machine team". If not a hoax, this is one of the many Steam Machines expected to come out of the SteamOS initiative. Rather than taking the platform to a desktop or home theater PC (HTPC) form factor, this company wants to target the handheld PC gaming market.
If it comes out, that is a clever use of SteamOS. I can see Big Picture Mode being just as useful on a small screen as it is on a TV, especially with its large fonts and controller navigation. The teasers suggest that it will use the haptic feedback-based touchpads that Valve is expected to base the Steam Controller on. It will also include a 5-inch touchscreen.
The Escapist got into contact with the team and received a few more specs:
- Quad-Core CPU (x86)
- 4GB RAM
- 32GB built-in storage
Even if this company does not make good on its promises, companies will now be considering portable SteamOS devices. This is the sort of outside-the-box thinking that Valve was hoping for when it set out to create an open platform. Each party will struggle to win at its own goals, yet each can also rely on the crowd (other companies or individuals) to keep up in areas where it does not want an edge.
Philosophy aside, the company is targeting 2015 with a "Standard Edition" supporting WiFi and 3G. It would make sense to have a WiFi-only model, but who knows.
Subject: Mobile | June 13, 2014 - 03:48 PM | Jeremy Hellstrom
Tagged: thermaltake, Massive TM, laptop cooler
The Thermaltake Massive TM is more than just a laptop cooler with a pair of 120mm fans to keep your temperatures in line; it can also track the temperature of your laptop. The cooler is USB-powered but does offer USB pass-through, so you do not end up down one plug when you are using the Massive TM. HiTech Legion's testing showed an average drop in temperature of around 4C; if that is worth $40 to you, then pick one up.
"The Thermaltake Massive TM is a 17” laptop and notebook cooler that comes with a little extra. The Massive TM by Thermaltake uses 4 temperature sensors that can each be repositioned to track temps on different parts of your laptop."
Here are some more Mobile articles from around the web:
- Fujitsu Lifebook T904 @ The Inquirer
- Nokia Lumia 630 @ The Inquirer
- iOCEAN X8 octa-core Smartphone Review @ Madshrimps
- Moto E @ The Inquirer
- LG G3 @ The Inquirer
Subject: General Tech, Mobile, Shows and Expos | June 9, 2014 - 02:10 PM | Scott Michaud
Tagged: shield tablet, shield, nvidia, E3 14, E3
The Tech Report had their screenshot-fu tested today with the brief lifespan of NVIDIA's SHIELD Tablet product page. As you can see, it is fairly empty. We know that it will have at least one bullet point of "Features" and that its name will be "SHIELD Tablet".
Image Credit: The Tech Report
Of course, being the first day of E3, it is easy to expect that such a device will be announced in the next couple of days. This is expected to be based on the Tegra K1 with 2GB of RAM and have a 2048x1536 touch display.
It does raise the question of what exactly a "SHIELD" is, however. Apart from being a first-party device, how would it be any different from other TegraZone devices? We know that Half-Life 2 and Portal have been ported to the SHIELD product line exclusively and will not be available on other Tegra-powered devices. Now that the SHIELD line is extending to tablets, I wonder how NVIDIA will handle this seemingly two-tier class of products (SHIELD vs. Tegra OEM devices). It might even depend on how many design wins they achieve, along with their overall mobile market share.
Subject: General Tech, Mobile | June 5, 2014 - 02:51 PM | Scott Michaud
Tagged: tegra k1, tegra, project tango, nvidia, google, Android
Today, Google announced their "Project Tango" developer kit for tablets with spatial awareness. With a price tag of $1,024 USD, it is definitely aimed at developers. In fact, the form to be notified about the development kit has a required check box that is labeled, "I am a developer". Slightly above the form is another statement, "These development kits are not a consumer device and will be available in limited quantities".
So yes, you can only buy these if you are a developer.
The technology is the unique part. Project Tango is aimed at developers making apps which understand the 3D world around the tablet. Two example categories they have already experimented with are robotics and computer vision. Of course, this could also translate to alternate reality games and mapping.
While Google has not been too friendly with OpenCL in its Android platform, it makes sense that they would choose a flexible GPU with a wide (and deep) range of API support. While other SoCs are probably capable enough, the Kepler architecture in the Tegra K1 is about as feature-complete as you can get in a mobile chip, because it is basically a desktop chip.
Google's Project Tango is available to developers, exclusively, for $1,024 and ships later this month.
Also, that price is clearly a pun.
Subject: Processors, Mobile | June 4, 2014 - 11:00 AM | Ryan Shrout
Tagged: computex, computex 2014, arm, cavium, thunderx
While much of the news coming from Computex was centered around PC hardware, many of ARM's partners are making waves as well. Take Cavium, for example, introducing the ThunderX CN88XX family of processors. With a completely custom ARMv8 architectural core design, the ThunderX processors will range from 24 to 48 cores and are targeted at large-volume servers and cloud infrastructure. 48 cores!
The ThunderX family will be the first SoC to scale up to 48 cores, and with a clock speed of 2.5 GHz and 16MB of L2 cache, it should offer some truly impressive performance levels. Cavium also claims it is the first socket-coherent ARM processor, using the Cavium Coherent Processor Interconnect. The I/O capacity stretches into the hundreds of Gigabits, and quad-channel DDR3 and DDR4 memory at speeds up to 2.4 GHz keeps the processors fed with work.
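Assuming four 64-bit channels (Cavium quotes only "quad channel" and the 2.4 GHz data rate, so the channel width is my assumption), the peak theoretical memory bandwidth works out like this:

```python
# Back-of-the-envelope peak bandwidth for quad-channel DDR4-2400.
# Channel width of 64 bits is assumed; Cavium does not specify it.
channels = 4
transfers_per_sec = 2400e6   # 2.4 GT/s per channel
bus_width_bytes = 8          # 64-bit channel
peak_gb_s = channels * transfers_per_sec * bus_width_bytes / 1e9
print(f"{peak_gb_s:.1f} GB/s")  # 76.8 GB/s
```

That is comfortably in server-class territory, which fits the workloads Cavium lists below.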
Here is the breakdown on the ThunderX families.
ThunderX_CP: Up to 48 highly efficient cores along with integrated virtSOC, dual socket coherency, multiple 10/40 GbE and high memory bandwidth. This family is optimized for private and public cloud web servers, content delivery, web caching, search and social media workloads.
ThunderX_ST: Up to 48 highly efficient cores along with integrated virtSOC, multiple SATAv3 controllers, 10/40 GbE & PCIe Gen3 ports, high memory bandwidth, dual socket coherency, and scalable fabric for east-west as well as north-south traffic connectivity. This family includes hardware accelerators for data protection/integrity/security, efficient user-to-user data movement (RoCE) and compressed storage. This family is optimized for Hadoop, block & object storage, distributed file storage and hot/warm/cold storage type workloads.
ThunderX_SC: Up to 48 highly efficient cores along with integrated virtSOC, 10/40 GbE connectivity, multiple PCIe Gen3 ports, high memory bandwidth, dual socket coherency, and scalable fabric for east-west as well as north-south traffic connectivity. The hardware accelerators include Cavium’s industry leading, 4th generation NITROX and TurboDPI technology with acceleration for IPSec, SSL, Anti-virus, Anti-malware, firewall and DPI. This family is optimized for Secure Web front-end, security appliances and Cloud RAN type workloads.
ThunderX_NT: Up to 48 highly efficient cores along with integrated virtSOC, 10/40/100 GbE connectivity, multiple PCIe Gen3 ports, high memory bandwidth, dual socket coherency, and scalable fabric with feature-rich capabilities for bandwidth provisioning, QoS, traffic shaping and tunnel termination. The hardware accelerators include high packet throughput processing, network virtualization and data monitoring. This family is optimized for media servers, scale-out embedded applications and NFV type workloads.
We spoke with ARM earlier this year about its push into the server market, and it is partnerships like these that will begin the ramp up to widespread adoption of ARM-based server infrastructure. The ThunderX family will begin sampling in early Q4 2014, and production parts should be available by early 2015.
Subject: General Tech, Displays, Mobile | June 3, 2014 - 07:54 PM | Jeremy Hellstrom
Tagged: vesa, dockport, DisplayPort, amd
Remember DockPort, the three-in-one connection we have discussed in the past? The Thunderbolt-ish connection for devices with DisplayPort allows transmission of audio and video, plus USB data and power, all on one connector. It's here! (Even if the devices aren't quite common yet.)
NEWARK, CA (3 June 2014) - The Video Electronics Standards Association (VESA) today announced the release of the DockPort standard. Developed by several VESA member companies, DockPort is an optional extension of the DisplayPort standard that will allow USB 3.1 data and DC power for battery charging to be carried over a single DisplayPort connector and cable that also carries high-resolution audio/video (A/V) data.
This new extension of the DisplayPort standard is fully backward compatible with all existing DisplayPort devices. When a DockPort-enabled DisplayPort source such as a computer or tablet is connected to a DockPort-enabled DisplayPort sink such as a display monitor or docking station, A/V plus USB data and power will be transferred over a common cable through a single connector. If either the source or sink device is not DockPort-enabled, then the source and sink will recognize only the DisplayPort A/V data stream.
"As computing platforms become increasingly mobile, it becomes necessary to reduce the number of external connectors," explained Steve Belt, Corporate Vice President, Strategic Alliances & Solutions Enablement at AMD, a VESA member company. "With DockPort, VESA has developed a technology standard that enhances elegant docking designs, reduces mobile form factors, and enriches the user experience with streamlined, one-cable access to a wide range of external displays, peripherals and storage."
DockPort is the first royalty-free industry standard that combines these three essential interface functions into a single connector. VESA first revealed its intention to develop this standard at the 2014 International Consumer Electronics Show. It anticipates that several vendors will demonstrate DockPort-enabled DisplayPort systems at Computex Taiwan, which begins today.
"Until today, most mobile computing platforms required three separate interfaces to support power charging, data transmission and external video," said Chris Griffith, Business Development Manager for Consumer and Computing Interface at Texas Instruments, a VESA member company. "With DockPort, VESA has elegantly merged this ungainly tangle of wires into a single, sleek connector, combining power charging with the industry's most popular data transport, USB, and the industry's highest-speed A/V transport, DisplayPort. DockPort can reduce system implementation cost as designers can reduce external connectors and simplify docking implementations."
VESA is developing a compliance test protocol to certify systems that meet the DockPort standard. Systems that satisfy this test protocol will be permitted to display VESA's new DockPort logo on their packaging as a guide for consumers seeking this capability.
"The new DockPort standard demonstrates the enormous adaptability of the DisplayPort standard," according to VESA Board Chair Alan Kobayashi, Fellow & Executive R&D Management for DisplayPort Group at MegaChips Technology America. "On the one hand, DisplayPort is a flexible A/V transport protocol that easily coexists with other protocols like USB; it plays nicely with others. On the other hand, DisplayPort is also a robust and proven connector design whose electro-mechanical properties can accommodate data and power over a common passive copper cable and interface."
Subject: Mobile | June 2, 2014 - 11:46 PM | Sebastian Peak
Tagged: UHD, M.2, gaming laptop, core i7, computex 2014, computex, ASUS ROG, asus, 4k, 15.6 inch
The GX500 is ASUS’s new ultrabook-thin 15.6" gaming laptop from the ROG series, and it features a very impressive 4K screen.
This...isn't your average gaming laptop
Just 0.75" thick (and weighing a robust 4.85 lbs, though that's not bad for a 15.6" gaming machine), the GX500 has some very impressive specs: up to an Intel Core i7 processor, NVIDIA GeForce GTX 860M graphics, and what sounds like an awesome UHD 3840x2160 display with ASUS "VisualMaster technology" for a claimed 100% NTSC wide color gamut, which ASUS says is a world first on a notebook.
The GX500 also includes an M.2 SSD running on a full PCIe x4 connection and features a dual-fan cooling system to keep thermals in check in what ASUS says is the world's thinnest 15" gaming notebook.
ASUS has not announced pricing, but states that it will be dependent upon configuration. The ASUS ROG GX500 will be available in Q3 2014.
For more Computex 2014 coverage, please check out our feed!
Subject: Processors | May 28, 2014 - 05:09 PM | Sebastian Peak
Tagged: tablet, SoC, Rockchip, mobile, Intel, atom, arm, Android
While details about upcoming Haswell-E processors were reportedly leaking out, an official announcement from Intel was made on Tuesday about another CPU product - and this one isn't a high-end desktop part. The chip giant is partnering with the fabless semiconductor manufacturer Rockchip to create a low-cost SoC for Android devices under the Intel name, reportedly fabricated at TSMC.
We saw almost exactly the opposite of this arrangement last October, when it was announced that Altera would be using Intel to fab ARMv8 chips. Try to digest this: Instead of Intel agreeing to manufacture another company's chip with ARM's architecture in their fabs, they are going through what is said to be China's #1 tablet SoC manufacturer to produce x86 chips...at TSMC? It's a small - no, a strange world we live in!
From Intel's press release: "Under the terms of the agreement, the two companies will deliver an Intel-branded mobile SoC platform. The quad-core platform will be based on an Intel® Atom™ processor core integrated with Intel's 3G modem technology."
As this upcoming x86 SoC is aimed at entry-level Android tablets this announcement might not seem to be exciting news at first glance, but it fills a short term need for Intel in their quest for market penetration in the ultramobile space dominated by ARM-based SoCs. The likes of Qualcomm, Apple, Samsung, TI, and others (including Rockchip's RK series) currently account for 90% of the market, all using ARM.
As previously noted, this partnership is very interesting from an industry standpoint, as Intel is sharing their Atom IP with Rockchip to make this happen. Though if you think back, the move isn't unprecedented... I recall something about a little company called Advanced Micro Devices that produced x86 chips for Intel in the past, and everything seemed to work out OK there...
When might we expect these new products in the Intel chip lineup codenamed SoFIA? Intel states "the dual-core 3G version (is) expected to ship in the fourth quarter of this year, the quad-core 3G version...expected to ship in the first half of 2015, and the LTE version, also due in the first half of next year." And again, this SoC will only be available in low-cost Android tablets under this partnership (though we might speculate on, say, an x86 SoC powered Surface or Ultrabook in the future?).
Subject: General Tech, Mobile | May 27, 2014 - 05:22 PM | Scott Michaud
Tagged: tablet, HP 7 Plus, hp, cheap tablet, cheap computer
Years ago, HP purchased Palm with the intention of producing tablets based on WebOS. After a very short time on the market, the company pulled the plug and liquidated their stock for $99. These tablets, of course, sold instantly. Now, HP has developed an Android tablet which actually intends to be sold at that $99 price point.
Called the HP 7 Plus, this tablet has a quad-core SoC from Allwinner Technology, based on the low-power ARM Cortex-A7 architecture. This is the architecture you often see paired with Cortex-A15 cores in ARM's "big.LITTLE" arrangement. Complementing this processor are 1GB of RAM, 8GB of internal storage, a microSD slot, 640x480 front-facing and 2MP rear-facing cameras, and about five hours of battery life. It is capable of Miracast over WiFi, which is an impressive feature at this price.
The operating system is Android 4.2.2, Jelly Bean. While this is not the most recent distribution of Android, it should definitely serve users looking for an under-$100 tablet. Seriously, this space is huge and often a crap shoot in terms of reliability. If HP released a decent device, it could be a winner.
The HP 7 Plus is apparently available now, but out of stock, for $99.99. I do not know whether they already released and sold out immediately, or if it is still waiting on its first shipment.
Subject: Mobile | May 24, 2014 - 11:47 PM | Tim Verry
Tagged: Windows 8.1, thinkpad 10, Lenovo, ips display, Intel, Bay Trail
Lenovo made the previously-rumored ThinkPad 10 tablet official earlier this month. The business-friendly tablet starts at $599 and will be available in a couple of weeks. Lenovo has packed in quite a bit of hardware into a 10-inch aluminum chassis to create a device capable of up to 10 hours of battery life (productivity not guaranteed).
The official ThinkPad 10 specifications closely match the previous rumors, but we now know a few finer details. In particular, Lenovo has gone with an aluminum shell hosting a 10.1" 1920x1200 IPS display with 10-point multi-touch (and Gorilla Glass), two cameras (2MP webcam and 8MP rear camera), an optional digitizer pen, and a number of docking options.
Fans of handwriting recognition will be pleased with the confirmation of a digitizer while typists will be able to pair the 10-inch tablet with a keyboard dock. Lenovo is also offering a Quickshot cover accessory which is a soft screen cover/case that has a corner that can be easily folded to reveal the camera (and performing this action automatically opens up the camera app).
The tablet dock (which doubles as a charger) is a docking station that adds two USB 3.0 ports, one HDMI port, and one Ethernet port. On the other hand, the keyboard dock has an angled slot for the ThinkPad 10 to sit in (there is no angled hinge here) and features a physical keyboard and small trackpad.
Finally, if you are more into the Microsoft Surface-style touch keyboard, Lenovo offers a case with an included touch-sensitive keyboard (keys with no physical actuation).
Internally, the ThinkPad 10 uses a Bay Trail Atom Z3795 SoC, either 2GB or 4GB of RAM, and up to 128GB of (eMMC 4.5.1) internal storage. Internal radios include 802.11n, Bluetooth, and cellular (3G and 4G LTE). The tablet itself has a micro HDMI video output, micro SD card slot for storage, and a single USB 2.0 port.
All decked out, you are looking at an aluminum-clad tablet weighing less than 1.4 pounds and running the full version of Windows 8.1, starting at $599 for the tablet itself. The five optional accessories (the docks and cases) will cost extra (see below). Note that the touch-sensitive keyboard case and the ruggedized case will be made available later this summer, following the June launch of the tablet and other options.
The $599 price ($728 with keyboard) may scare away consumers wanting an entertainment device, but business users and content creators with frequent travel needs (see our own Ryan Shrout) will appreciate the niche features, battery life, and build quality.
For those curious, the accessory costs will break down as follows:
- Ultrabook Keyboard: $129
- Tablet Dock: $119
- Quickshot cover: $59
- Rugged Case: $69 (available later this summer)
- Touch Case: $119 (available later this summer)
Subject: General Tech, Graphics Cards, Mobile | May 22, 2014 - 04:58 PM | Scott Michaud
Tagged: tegra k1, nvidia, iris pro, iris, Intel, hd 4000
The Chinese tech site Evolife acquired a few benchmarks for the Tegra K1. We do not know exactly where they got the system from, but we know that it has 4GB of RAM and 12GB of storage. Of course, this is the version with four ARM Cortex-A15 cores (not the upcoming, 64-bit version based on Project Denver). On 3DMark Ice Storm Unlimited, the full system was capable of 25737 points.
Image Credit: Evolife.cn
You might remember that our tests with an Intel Core i5-3317U (Ivy Bridge), back in September, achieved a score of 25630 on 3DMark Ice Storm. Of course, that was using the built-in Intel HD 4000 graphics, not a discrete solution, but it still kept up for gaming. This makes sense, though. Intel HD 4000 (GT2) graphics has a theoretical performance of 332.8 GFLOPs, while the Tegra K1 is rated at 364.8 GFLOPs. Earlier, we said that its theoretical performance is roughly on par with the GeForce 9600 GT, although the Tegra K1 supports newer APIs.
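As a sanity check, those theoretical figures can be reproduced from shader unit counts and clock speeds. The unit counts, FLOPs-per-cycle, and clocks below are my own assumptions (widely published figures for these parts), used here only to show where the article's numbers come from:

```python
# Peak single-precision throughput = units x FLOPs per unit per cycle x clock (GHz).
def peak_gflops(units, flops_per_cycle, clock_ghz):
    return units * flops_per_cycle * clock_ghz

# Intel HD 4000 (GT2): 16 EUs, 16 FLOPs/EU/cycle, 1.30 GHz max turbo (assumed).
hd4000 = peak_gflops(16, 16, 1.30)   # 332.8 GFLOPs
# Tegra K1: 192 Kepler CUDA cores, 2 FLOPs/core/cycle (FMA), 0.95 GHz (assumed).
k1 = peak_gflops(192, 2, 0.95)       # 364.8 GFLOPs
```

Note that these are paper numbers; sustained throughput depends heavily on thermals and memory bandwidth, especially in a tablet.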
Of course, Intel has released better solutions with Haswell. Benchmarks show that Iris Pro is able to play Battlefield 4 on High settings, at 720p, with about 30FPS. The HD 4000 only gets about 12 FPS with the same configuration (and ~30 FPS on Low). This is not to compare Intel to NVIDIA's mobile part, but rather compare Tegra K1 to modern, mainstream laptops and desktops. It is getting fairly close, especially with the first wave of K1 tablets entering at the mid-$200 USD MSRP in China.
As a final note...
There was a time when Tim Sweeney, CEO of Epic Games, said that the difference between high-end and low-end PCs "is something like 100x", and that scaling a single game between the two performance tiers would be next to impossible. He noted that ten years earlier, that factor was more like "10x".
Now, an original GeForce Titan is about 12x faster than the Tegra K1, and they support the same feature set. In other words, it is easier to develop a game for both the PC and a high-end tablet today than it was to develop a PC game for both high-end and low-end machines back in 2008. PC gaming is, once again, getting healthier.
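That "about 12x" figure can be roughly reproduced from peak FLOPs. The core counts and clocks below are my assumptions for illustration (Kepler cores do 2 FLOPs per cycle via fused multiply-add):

```python
# Original GeForce GTX Titan: 2688 Kepler cores at 0.837 GHz base clock (assumed).
titan_gflops = 2688 * 2 * 0.837   # ~4499.7 GFLOPs
# Tegra K1: 192 Kepler cores at 0.95 GHz (assumed).
k1_gflops = 192 * 2 * 0.95        # ~364.8 GFLOPs
ratio = titan_gflops / k1_gflops  # ~12.3x, matching the "about 12x" above
```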
Subject: General Tech, Mobile | May 17, 2014 - 04:07 AM | Scott Michaud
Tagged: Lawsuit, google, apple
If we all could just get along and get back to work...
On Friday, May 16th, Apple and Google (including the remains of its Motorola Mobility division) released a joint statement marking the end of all patent litigation between the two companies. The two have been in legal warfare for three and a half years now. They will also "work together in some areas of patent reform", though it is unclear what that actually means.
This decision does not seem to affect Apple's ongoing litigation with Samsung. Those two companies are still in a famous and fierce skirmish over mankind's greatest UX innovations, like slide-to-unlock and the little bounce that happens when you scroll to the end of a list too fast. Those are, honestly, the issues that we are facing. I have a suggestion for an area to reform...
... but that has been beaten to death for years, now. It, at least, shows a willingness to cooperate going forward. It also shows a slight bit more promise for products like Ubuntu on phones, Firefox OS, and even smaller initiatives. You can say what you like about the current litigation, but closing the road for independent developers with great and innovative ideas is terrible and bad for society. Unique smartphones could be made, each with slide-to-unlock, just like unique OSes can use icons and web browsers can use tabs.