Subject: General Tech, Displays, Mobile | July 15, 2014 - 05:39 PM | Jeremy Hellstrom
Tagged: displaylink, club 3d, 4k
Why would you want a USB 3.0 4K display adapter, you might ask? Perhaps you have an ultrabook whose limited display outputs cannot drive 4K resolution, but you have somehow gotten your hands on a 4K display for work or leisure and need the full resolution. Club 3D now has a family of USB adapters for you: the CSV-2302 USB 3.0 to DisplayPort 4K, the CSV-2301 USB 3.0 to DisplayPort 1600p, and the CSV-2300D USB 3.0 to DVI-I graphics adapters. The CSV-2302 is the first implementation of the DisplayLink DL-5500 chipset, and it does indeed support 10-bit colour if your display can handle it.
The MSRP for this device when it starts to ship in about 2 weeks will be ~$142.
Club 3D officially launches the next generation of USB 3.0 Graphics adapters capable of outputting high resolutions to DVI-I (2048x1152) and DisplayPort (2560x1600), as well as the world’s first USB 3.0 to DisplayPort Graphics adapter (CSV-2302), which supports 4K or Ultra High Definition resolution at 3840x2160.
The Universal Serial Bus (USB) port of a desktop computer or notebook is multifunctional and can be used to connect a large variety of storage devices, keyboards, mice and other peripherals like monitors. Back in 2011, Club 3D introduced its first SenseVision USB Graphics adapters. These small external graphics adapters can be used to connect a DVI or HDMI monitor to the USB 2.0 output of a desktop computer or notebook and create a multi-screen setup.
The SenseVision USB adapters proved to be very successful across the globe! Not only with travelers but also in (semi) professional environments where more monitors mean more productivity.
The new Club 3D USB 3.0 Graphics adapters are fully ‘Plug and Display’ certified and the USB 3.0 to 4K Graphics Adapter (CSV-2302) is the very first to use the brand new DisplayLink DL-5500 chipset enabling 4K Ultra High Definition output to DisplayPort enabled 4K monitors at 30Hz. The Club 3D USB 3.0 to 4K Graphics Adapter (CSV-2302) is the first device available worldwide with the revolutionary new DisplayLink SoC implemented.
This graphics adapter uses few of your system's resources, so it won't affect performance while still ensuring great image quality. It's the ideal solution for anyone wanting to expand desktop space in order to use multiple programs simultaneously.
- 3840x2160 output at 30Hz
- Backwards compatible with QHD and HD monitors
- DP 1.2 interface (DisplayPort)
- HDCP 2.0 for protected video playback
- Integrated DisplayPort Audio
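A quick back-of-envelope check (my own arithmetic, not from Club 3D or DisplayLink) shows why a dedicated chipset like the DL-5500 is needed here: an uncompressed 4K stream at 30Hz slightly exceeds USB 3.0's raw 5 Gb/s signalling rate, so the DisplayLink chipset has to compress the framebuffer before sending it over the wire.

```python
# Rough bandwidth estimate for an uncompressed 3840x2160 stream at 30 Hz,
# 24 bits per pixel, versus the USB 3.0 SuperSpeed signalling rate.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 30, 24

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
usb3_gbps = 5.0  # USB 3.0 SuperSpeed raw signalling rate

print(f"Uncompressed 4K30 stream: {raw_gbps:.2f} Gb/s")  # ~5.97 Gb/s
print(f"USB 3.0 raw link rate:    {usb3_gbps:.2f} Gb/s")
print(f"Compression required:     {raw_gbps > usb3_gbps}")
```

And that comparison is generous: usable USB 3.0 throughput after encoding overhead is well below 5 Gb/s, which makes compression even more essential.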
Subject: Mobile | July 14, 2014 - 03:46 PM | Jeremy Hellstrom
Tagged: memo pad 7, memopad, asus, Android 4.4.2
The ASUS MeMO Pad 7 has a 7" 1280x800 IPS display, a Bay Trail Atom Z3745 quad-core that can run up to 1.86GHz, 1GB of RAM and 16GB of internal storage with support for SD cards up to 32GB. All in all, these seem like the stats you would expect from a $150 tablet, but the challenge is to be usable enough not to be returned. Legit Reviews tested out this tablet and were impressed by the graphics performance of the new Atom but were disappointed by the WiFi speeds, which were significantly slower than those of their preferred tablet, the ~$200 Nexus 7.
"Budget friendly Android tablets are a dime a dozen these days, but they all aren’t created equally and there are some very bad tablets out there. When you get into the sub $150 tablet market you need to be very careful with what tablet you go with as companies start cutting costs by reducing the hardware specifications and that can lead to subpar performance and an overall bad user experience. If you’ve ever purchased an inexpensive tablet thinking that they were all the same, you usually find out in under three minutes that you screwed up and will be running to return it."
Here are some more Mobile articles from around the web:
- Adata PV100 4200mAh USB Battery @ eTeknix
- Kingston Mobilelite Wireless G2 @ Hardware Asylum
- Thermaltake Massive Notebook Coolers (V20, SP, TM) Review @ OCC
- Steelseries Stratus Bluetooth iOS Mobile Gaming Controller @ eTeknix
- Motorola Moto G Smartphone Review @ Hardware Secrets
Subject: General Tech, Processors, Mobile | July 11, 2014 - 04:58 PM | Scott Michaud
Tagged: x86, VIA, isaiah II, Intel, centaur, arm, amd
There might be a third, x86-compatible processor manufacturer who is looking at the mobile market. Intel has been trying to make headway, including the direct development of Android for the x86 architecture. The company also has a few design wins, mostly with Windows 8.1-based tablets but also the occasional Android-based models. Google is rumored to be preparing the "Nexus 8" tablet with one of Intel's Moorefield SoCs. AMD, the second-largest x86 processor manufacturer, is aiming their Mullins platform at tablets and two-in-ones, but cannot afford to play snowplow, at least not like Intel.
VIA, through their Centaur Technology division, is expected to announce their own x86-based SoC, too. Called Isaiah II, it is rumored to be a quad-core, 64-bit processor with a maximum clock rate of 2.0 GHz. Its GPU is currently unknown. VIA sold their stake in S3 Graphics to HTC back in 2011, making HTC the majority shareholder of the GPU company. That said, HTC and VIA are very close companies: the chairwoman of HTC is the founder of VIA Technologies, and the current President and CEO of VIA, who has held that position since 1992, is her husband. I expect that the GPU architecture will be provided by S3, or will somehow be based on their technology. I could be wrong; both companies will obviously do what they think is best.
It would make sense, though, especially if it benefits HTC with cheap but effective SoCs for Android and "full" Windows (not Windows RT) devices.
Or this announcement could be larger than it would appear. Three years ago, VIA filed for a patent describing a processor that can read both x86 and ARM machine language and translate either into its own internal microinstructions. The Centaur Isaiah II could reasonably be based on that technology, in which case the processor would be able to support either version of Android. Then again, now that Intel has built up the Android x86 code base, VIA may have shelved that initiative (or simply filed the patent for legal reasons).
But what about Intel? Honestly, I see this being a benefit for the behemoth. Extra x86-based vendors will probably grow the overall market share, relative to ARM, by helping with software support. Even if Isaiah II is compatible with both ARM and x86, what Intel needs right now is software, and they can only write so much of it themselves. It is possible that VIA, maker of the original netbook processor, could disrupt the PC market with both x86 and ARM compatibility, but I doubt it.
Centaur Technology, the relevant division of VIA, will make their announcement in less than 51 days.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | July 7, 2014 - 04:06 AM | Scott Michaud
Tagged: tegra k1, OpenGL ES, opengl, Khronos, google io, google, android extension pack, Android
Sure, this is a little late. Honestly, when I first heard the announcement, I did not see much news in it. The slide from the keynote (below) showed four points: Tessellation, Geometry Shaders, Computer [sic] Shaders, and ASTC Texture Compression. I thought tessellation and geometry shaders were part of the OpenGL ES 3.1 spec, like compute shaders. This led to my immediate reaction: "Oh cool. They implemented OpenGL ES 3.1. Nice. Not worth a news post."
Image Credit: Blogogist
Apparently, they were not part of the ES 3.1 spec (although compute shaders are). My mistake. It turns out that Google is cooking up their own vendor-specific extensions. This is quite interesting, as it adds functionality to the API without the developer needing to target a specific GPU vendor (INTEL, NV, ATI, AMD), wait for approval from the Architecture Review Board (ARB), or use multi-vendor extensions (EXT). In other words, it sounds like developers can target Google as the vendor without knowing the actual hardware.
Hiding the GPU vendor from the developer is not the only reason for Google to host their own vendor extension. The added features are mostly from full OpenGL. This makes sense, because it was announced with NVIDIA and their Tegra K1, Kepler-based SoC. Full OpenGL compatibility was NVIDIA's selling point for the K1, due to its heritage as a desktop GPU. But, instead of requiring apps to be programmed with full OpenGL in mind, Google's extension pushes it to OpenGL ES 3.1. If the developer wants to dip their toe into OpenGL, then they could add a few Android Extension Pack features to their existing ES engine.
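In practice, the whole pack is advertised through a single umbrella extension, GL_ANDROID_extension_pack_es31a, in the driver's GL_EXTENSIONS string. Here is a minimal sketch of how an app might gate its fancier renderer on that string; the reported extension list below is an illustrative stand-in, not output from any real device.

```python
# Umbrella extension that advertises the full Android Extension Pack
# on top of OpenGL ES 3.1.
AEP = "GL_ANDROID_extension_pack_es31a"

def supports_aep(extension_string: str) -> bool:
    """Return True if the space-separated GL extension list includes the AEP."""
    return AEP in extension_string.split()

# Hypothetical driver-reported string for illustration only:
reported = "GL_EXT_texture_filter_anisotropic GL_ANDROID_extension_pack_es31a GL_KHR_debug"
print(supports_aep(reported))  # True
```

A real app would fetch the string from the GL context (or check the feature at the Java level) rather than hard-code it, but the gating logic is the same: one membership test, then enable the tessellation/geometry-shader path.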
Epic Games' Unreal Engine 4 "Rivalry" Demo from Google I/O 2014.
The last feature, ASTC Texture Compression, was an interesting one. Apparently the Khronos Group, owners of OpenGL, were looking for a new generation of texture compression technologies. NVIDIA suggested their ZIL technology. ARM and AMD also proposed "Adaptive Scalable Texture Compression". ARM and AMD won, although the Khronos Group stated that the collaboration between ARM and NVIDIA made both proposals better than either in isolation.
Android Extension Pack is set to launch with "Android L". The next release of Android is not currently associated with a snack food. If I was their marketer, I would block out the next three versions as 5.x, and name them (L)emon, then (M)eringue, and finally (P)ie.
Would I do anything with the two skipped letters before pie? (N)(O).
When Magma Freezes Over...
Intel confirms that they have approached AMD about access to their Mantle API. The discussion, despite being clearly labeled as "an experiment" by an Intel spokesperson, was initiated by them -- not AMD. According to AMD's Gaming Scientist, Richard Huddy, via PCWorld, AMD's response was, "Give us a month or two" and "we'll go into the 1.0 phase sometime this year" which only has about five months left in it. When the API reaches 1.0, anyone who wants to participate (including hardware vendors) will be granted access.
AMD inside Intel Inside???
I do wonder why Intel would care, though. Intel has the fastest per-thread processors, and their GPUs are not known to be workhorses that are held back by API call bottlenecks, either. Of course, that is not to say that I cannot see any reason, however...
Subject: Mobile | July 3, 2014 - 02:54 PM | Jeremy Hellstrom
Tagged: kingston, MobileLite Wireless G2
The Kingston MobileLite Wireless G2 is hard to describe quickly: you can plug memory cards or USB flash drives into it and access them from a wireless device, plug in an Ethernet cable to use it as a wireless router, or plug USB devices into it to recharge them. Often these all-in-one devices tend towards doing several things poorly as opposed to one thing very well, but in this case it seems Kingston has pulled it off. Techgage was not terribly impressed with the features of the software, but the utilitarian nature of the interface does keep things simple.
"There are mobile media readers, and then there’s Kingston’s MobileLite Wireless G2. When not serving files over Wi-Fi, it can accept a wired LAN connection to become a travel router, and it can also use its huge battery to help charge your mobile phone while you’re on-the-go. Who doesn’t love a device that can act as a jack-of-all-trades?"
Here are some more Mobile articles from around the web:
- MSI GP70-2PE ‘Leopard’ Gaming Notebook @ eTeknix
- MSI GS60 2PE Ghost Pro @ Kitguru
- MSI GS60-2PE ‘Ghost Pro’ Gaming Notebook @ eTeknix
- PC Specialist Cosmos II @ eTeknix
- MSI GT70-2PE ‘Dominator Pro’ Gaming Notebook @ eTeknix
- Sony Xperia T2 Ultra Dual Smartphone Review @ Hardware Secrets
- Samsung Galaxy Tab Pro 10.1 @ The Inquirer
- LG G3 @ The Inquirer
Subject: Mobile | July 2, 2014 - 12:00 PM | Ryan Shrout
Tagged: linux, linaro, juno, google, armv8-a, ARMv8, arm, android l
Even though Apple has been shipping a 64-bit capable SoC since the release of the A7 part in September of 2013, the Android market has yet to see its first consumer 64-bit SoC release. That is about to change as we progress through the rest of 2014, and ARM is making sure that major software developers have the tools they need to be ready for the architecture shift. That help will come in the form of the Juno ARM Development Platform (ADP) and a 64-bit-ready software stack.
Apple's A7 is the first core to implement ARMv8, but companies like Qualcomm, NVIDIA and of course ARM have their own cores based on the 64-bit architecture. Much like we saw with the 64-bit transition in the x86 ecosystem, ARMv8 will improve access to large datasets and will bring performance gains thanks to increased register sizes, virtual address spaces larger than 4GB and more. ARM also improved NEON (SIMD) performance and cryptography support while they were in there fixing up the house.
The Juno platform is the first 64-bit development platform to come directly from ARM and combines a host of components to create a reference hardware design for integrators and developers to target moving forward. Featuring a test chip built around Cortex-A57 (dual core), Cortex-A53 (quad core) and Mali-T624 (quad core), Juno allows software to target 64-bit development immediately without waiting for other SoC vendors to have product silicon ready. The hardware configuration implements big.LITTLE, OpenGL ES 3.0 support, thermal and power management, Secure OS capability and more. In theory, ARM has built a platform that will be very similar to SoCs built by its partners in the coming months.
ARM isn't quite talking about the specific availability of the Juno platform, but for the target audience ARM should be able to provide the number of development platforms necessary. Juno enables software development for 64-bit kernels, drivers, tools and virtual machine hypervisors, but it's not necessarily going to help developers writing generic applications. Think of Juno as the development platform for the low-level designers and coders, not those that are migrating Facebook or Flappy Bird to your next smartphone.
The Juno platform helps ARM in a couple of specific ways. From a software perspective, it creates a common foundation for the ARMv8 ecosystem and gives developers access to silicon before ARM's partners have prepared their own platforms. ARM claims that Juno is a fairly "neutral" platform, so software developers won't feel like they are being funneled in one direction. I'd be curious what ARM's partners actually think about that, though, given the inclusion of Mali graphics, a product that ARM is definitely trying to promote in a competitive market.
Though the primary focus might be software, hardware partners will also benefit from Juno. On this board they will find the entire ARMv8 IP portfolio tested in modern silicon. This lets hardware vendors see the A57 and A53 working in action, with the added benefit of a full big.LITTLE implementation. The hope is that this will dramatically accelerate the time to market for future 64-bit ARM designs.
The diagram above shows the full breakdown of the Juno SoC as well as some of the external connectivity on the board itself. The memory system is built around 8GB of DDR3 running at 12.8 GB/s, and the platform is extensible through the PCI Express slots and the FPGA options.
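As a sanity check on that figure (my own inference, not from ARM's materials): 12.8 GB/s is exactly what a single 64-bit channel of DDR3-1600 delivers, so the Juno board is most likely running one such channel.

```python
# Peak theoretical bandwidth of one assumed 64-bit DDR3-1600 channel:
# 1600 million transfers/second x 8 bytes per transfer.
transfers_per_sec = 1600e6   # DDR3-1600 data rate (assumption)
bus_width_bytes = 64 // 8    # assumed 64-bit channel width

bandwidth_gbs = transfers_per_sec * bus_width_bytes / 1e9
print(bandwidth_gbs)  # 12.8
```

Real-world sustained bandwidth would of course land below this peak once refresh cycles and controller overhead are accounted for.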
Of course hardware is only half the story - today Linaro is releasing a 64-bit port of the Android Open Source Project (AOSP) that will run on Juno. That, along with the Linux kernel v3.14 with ARMv8-A support should give developers the tools needed to write the applications, middleware and kernels for future hardware. Also worth noting on June 25th at Google I/O was the announcement of developer access coming for Android L. This build will support ARMv8-A as well.
The switch to 64-bit technology on ARM devices isn't going to happen overnight, but ARM and its partners have put together a collective ecosystem that will allow software and hardware developers to make the transition as quickly and, most importantly, as painlessly as possible. With outside pressure pushing on ARM and its low-power processor designs, it is taking more of its fate into its own hands, pushing the 64-bit transition forward at an accelerated pace. This helps ARM in the mobile space and the consumer space as well as the enterprise market, a key area for SoC growth.
Subject: Mobile | June 29, 2014 - 06:39 PM | Jeremy Hellstrom
Tagged: Republic of Gamers, ips display, i7-4710HQ, gtx 850m, G550JK, asus, 15.6 inch
Fremont, CA (June 26, 2014) - ASUS Republic of Gamers (ROG) today announces the G550JK gaming notebook, a compact powerhouse with a crisp 15.6-inch display that offers all the benefits associated with the award-winning ROG notebook range in an even more portable form factor. Powered by the latest 4th-generation Intel Core i7 processors, and featuring an NVIDIA GeForce GTX 850M GPU that can be overclocked thanks to ASUS TurboMaster technology, the G550JK provides the ultimate gaming-on-the-go experience. In a collaborative effort, the G550JK was used to scale Mt. Elbert, the highest peak of the Rocky Mountains, in an attempt to set the world record for highest elevation LAN party.
Powerful, stylish, and feature-heavy
The G550JK may be compact, slim and portable, but it packs a big punch. ASUS TurboMaster technology, along with dual fans and copper heatsinks, allows the NVIDIA GeForce GTX 850M GPU to be safely overclocked by up to 5%. Meanwhile, Optimus support maximizes battery life when not running GPU-demanding applications.
The sleek low-profile aluminum lines of the G550JK are enhanced by the signature ROG color scheme of black with fine red diamond-cut detailing. The red backlight of the seamless one-piece chiclet keyboard makes it easy on the eyes when gaming in darkened environments, while the subtly illuminated ROG logo on the lid adds a touch of exclusivity and makes sure opponents know exactly what they’re up against. Measuring just 1.1 inches at the thickest point, the G550JK can go anywhere, and win everywhere.
The G550JK packs a 15.6-inch Full HD IPS LED-backlit display that provides a stunning visual experience with wide 178-degree viewing angles and an anti-glare coating for comfort during long gaming sessions. The included 802.11ac WiFi ensures ultra-fast ping times and transfer rates to deliver the best wireless gaming when paired with an 802.11ac router. ASUS SonicMaster Premium, incorporating ICEpower, Bang & Olufsen technology and an external SonicMaster subwoofer, gives the G550JK powerful high-fidelity audio and added bass for a more immersive gaming experience.
The G550JK on a quest for a world record
HighLANder, an event hosted by Linus Media Group and Tek Syndicate on June 23rd, 2014 in Leadville, Colorado, used the ROG G550JK gaming notebook to scale Mt. Elbert, the highest peak of the Rocky Mountains at 14,440 ft., then connected wirelessly via an ASUS RT-AC68U router at the summit to set a world record for the highest-elevation LAN party. The elevation was certified by expert witness Elizabeth Thompson, a PhD student in Atmospheric Meteorology, and has been submitted to the Guinness Book of World Records for review and certification. Ten G550JK notebooks were used to accomplish the feat, along with 13 participants from LinusTechTips, Tek Syndicate, Newegg TV, and ASUS.
Subject: Processors, Mobile | June 23, 2014 - 01:08 PM | Ryan Shrout
Tagged: snapdragon, qualcomm, gaming, Android, adreno
Today Qualcomm has published a 22-page white paper that keys in on the company's focus around Android gaming and the benefits that Qualcomm SoCs offer. As the dominant SoC vendor in the Android ecosystem of smartphones, tablets and handhelds (shipping more than 32% in Q2 of 2013) QC is able to offer a unique combination of solutions to both developers and gamers that push Android gaming into higher fidelity with more robust game play.
According to the white paper, Android gaming is the fastest growing segment of the gaming market with a 30% compound annual growth rate from 2013 to 2015, as projected by Gartner. Experiences for mobile games have drastically improved since Android was released in 2008 with developers like Epic Games and the Unreal Engine pushing visuals to near-console and near-PC qualities.
Qualcomm is taking a heterogeneous approach to address the requirements of gaming that include AI execution, physics simulation, animation, low latency input and high speed network connectivity in addition to high quality graphics and 3D rendering. Though not directly a part of the HSA standards still in development, the many specialized engines that Qualcomm has developed for its Snapdragon SoC processors including traditional CPUs, GPUs, DSPs, security and connectivity allow the company to create a solution that is built for Android gaming dominance.
In the white paper Qualcomm dives into the advantages that the Krait CPU architecture offers for CPU-based tasks as well as the power of the Adreno 4x series of GPUs that offer both raw performance and the flexibility to support current and future gaming APIs. All of this is done with single-digit wattage draw and a passive, fanless design and points to the huge undertaking that mobile gaming requires from an engineering and implementation perspective.
For developers, the ability to target Snapdragon architectures with a single code path that can address a scalable product stack allows for the least amount of development time and the most return on investment possible. Qualcomm continues to support the development community with tools and assistance to bring out the peak performance of Krait and Adreno to get games running on lower power parts as well as the latest and upcoming generations of SoCs in flagship devices.
It is great to see Qualcomm focus on this aspect of the mobile market; the challenges it presents demand strong dedication from these engineering teams. Being able to create compelling gaming experiences with high-quality imagery while maintaining the required power envelope is a task that many other companies have struggled with.
Check out the new landing page over at Qualcomm if you are interested in more technical information as well as direct access to the white paper detailing the work Qualcomm is putting into its Snapdragon line of SoCs for gamers.
Introduction and Design
It was only last year that we were singing the praises of the GT60, which was one of the fastest notebooks we’d seen to date. Its larger cousin, the GT70, features a 17.3” screen (versus the GT60’s 15.6”), faster CPUs and GPUs, and even better options for storage. Now, the latest iteration of this force to be reckoned with has arrived on our desks, and while its appearance hasn’t changed much, its performance is even better than ever.
While we’ll naturally be spending a good deal of time discussing performance and stability in our article here, we won’t be dedicating much to casing and general design, as—for the most part—it is very similar to that of the GT60. On the other hand, one area on which we’ll be focusing particularly heavily is that of battery life, thanks solely to the presence of NVIDIA’s new Battery Boost technology. As the name suggests, this new feature employs power conservation techniques to extend the notebook’s life while gaming unplugged. This is accomplished primarily via frame rate limiting, which is a feature that has actually been available since the introduction of Kepler, but which until now has been buried within the advanced options available for such products. Battery Boost basically brings this to the forefront and makes it both accessible and default.
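The core idea behind that frame rate limiting is simple enough to sketch. The snippet below is my own reconstruction of the concept, not NVIDIA's implementation: after a frame finishes rendering, the driver can idle the GPU for whatever remains of the per-frame time budget instead of immediately racing into the next frame at full power.

```python
# Conceptual sketch of frame-rate limiting for power savings
# (an illustration of the technique, not NVIDIA's actual Battery Boost code).

def frame_budget(target_fps: float) -> float:
    """Seconds available per frame at the target frame rate."""
    return 1.0 / target_fps

def idle_time(render_seconds: float, target_fps: float = 30.0) -> float:
    """How long the GPU can idle after a frame that took render_seconds."""
    return max(0.0, frame_budget(target_fps) - render_seconds)

# A frame rendered in 10 ms under a 30 FPS cap leaves roughly 23 ms of
# idle time per frame, which is where the battery savings come from:
print(round(idle_time(0.010) * 1000, 1))  # ~23.3 (ms)
```

The trade-off is obvious: the cap sacrifices peak frame rate for runtime, which is exactly why it makes sense as a default only when running on battery.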
Let’s take a look at what this bad boy is packing:
Not much commentary needed here; this table reads like a who’s who of computer specifications. Of particular note are the 32 GB of RAM, the 880M (of course), and the 384 GB SSD RAID array (!!). Elsewhere, it’s mostly business as usual for the ultra-high-end MSI GT notebooks, with a slightly faster CPU than the previous model we reviewed (the i7-4700MQ). One thing is guaranteed: it’s a fast machine.