Subject: General Tech, Mobile | January 11, 2018 - 05:20 PM | Tim Verry
Tagged: snapdragon 845, Samsung, MWC, galaxy s9, galaxy, exynos 9810, CES 2018, CES
Samsung confirmed to ZDNet at CES that it plans to launch its new flagship Galaxy S smartphone at Mobile World Congress next month. The company has managed to keep a tight lid on the new devices, which are expected to be named the Galaxy S9 and Galaxy S9+, with surprisingly few leaks. Samsung will reportedly show off the smartphone and announce the official specifications along with the release date and pricing information at its MWC keynote event.
Thanks to the rumor mill, there are potential specifications floating about, with a few conflicting bits of information, particularly where the fingerprint scanner is concerned. Looking around, there seem to be several corroborated (but still rumored) specifications for the Galaxy S9 and Galaxy S9+. Allegedly the Galaxy smartphones will feature curved Super AMOLED displays with QHD+ (3200x1800) resolutions measuring 5.8" on the Galaxy S9 and 6.2" on the Galaxy S9+. Further, Samsung is equipping them with dual rear-facing cameras, USB-C, and a 3.5mm headphone jack. There are conflicting rumors about the fingerprint scanner: some claim it will be a fingerprint sensor embedded in the display, while others claim that Samsung ran into issues and instead opted for a rear-mounted sensor.
Internally, the Galaxy S9 and Galaxy S9+ will be powered by a Qualcomm Snapdragon 845 in the US and Samsung's own Exynos 9810 SoC outside of the US. Cat 18 LTE support is present in either case, making faster-than-gigabit download speeds theoretically possible (though real-world speeds will be lower). The Galaxy S9 will allegedly be offered with 4GB of RAM and either 64GB or 128GB of storage, while the S9+ will have 6GB of RAM and up to 256GB of internal flash storage.
In any case, the Galaxy S9 and S9+ are set to be powerhouses with the latest SoCs and (hopefully) large batteries for those Infinity Displays! It seems we will have to wait another month for official information, but the phones should be out within the first quarter, which is actually pretty fast considering it feels like the Galaxy S8 just came out (although that was actually last March, heh). Mobile World Congress 2018 is scheduled from February 26th to March 1st in Barcelona, Spain.
What are your thoughts on the Galaxy S9 rumors so far? Do you plan to upgrade? This year may be the year I upgrade my LG G3 since the display is dying, but we'll see!
Subject: Mobile | March 1, 2017 - 02:26 PM | Tim Verry
Tagged: Snapdragon 625, opinion, MWC, keyone, enterprise, Cortex A53, blackberry, Android 7.1, Android
February is quite the busy month with GDC, MWC, and a flurry of technology announcements coming out all around the same time! One of the more surprising announcements from Mobile World Congress in Barcelona came from BlackBerry in the form of a new mid-range smartphone it is calling the KEYone. The KEYone is an Android 7.1 smartphone actually built by TCL with an aluminum frame, "soft touch" plastic back, curved edges, and (in traditional CrackBerry fashion) a full physical QWERTY keyboard!
The black and silver candy bar style KEYone (previously known as "Mercury") measures 5.78" x 2.85" x 0.37" and weighs 0.39 pounds. The left, right, and bottom edges are rounded and the top edge is flat. There are two bottom-firing stereo speakers surrounding a USB Type-C port (Type-C 1.0 with USB OTG), a headphone jack up top, and volume, power, and convenience key buttons on the right side. The front of the device, which BlackBerry has designed to be comfortable to use one-handed, features a 4.5" 1620 x 1080 LCD touchscreen (434 PPI) protected by Gorilla Glass 4, a front facing camera with LED flash, and a large physical keyboard with straight rows of keys that have a traditional BlackBerry feel. The keyboard, in addition to having physical buttons, supports touch gestures such as swiping, and the spacebar has a fingerprint reader that early hands-on reports indicate works rather well for quickly unlocking the phone. Further, every physical key can be programmed as a hot key to open any application with a long press (B for browser, E for email, etc.).
On the camera front, BlackBerry is using the same sensor found in the Google Pixel, the Sony IMX378. The rear camera is a 12MP f/2.0 unit with dual LED flash and phase-detect autofocus, and it is joined by a front facing 8MP camera. Both cameras can record 1080p30 video and support HDR along with software features like face detection. Android Central reports that the camera software is rather good (it even has a pro mode) and the camera is snappy at taking photos.
Internally, BlackBerry has opted to go with squarely mid-range hardware, which is disappointing but not the end of the world. Specifically, the KEYone is powered by a Snapdragon 625 (MSM8953) with eight ARM Cortex A53 cores clocked at 2GHz and an Adreno 506 GPU, paired with 3GB of RAM and 32GB of internal storage. Wireless support includes dual band 802.11ac, FM, Bluetooth 4.2, GPS, NFC, and GSM/HSPA/LTE cellular radios. The smartphone uses a 3,505 mAh battery that is not user removable but at least supports Quick Charge 3.0, which can reportedly charge the battery to 50% in 36 minutes. Storage can be expanded via MicroSD cards. The smartphone is running Android 7.1.1 with some BlackBerry UI tweaks but is otherwise fairly stock. Under the hood, however, BlackBerry has hardened the OS and includes its DTEK security software, along with promising monthly updates.
Not bad, right? Looking at the specifications and reading/watching the various hands-on reports coming out, it is really looking like BlackBerry (finally) has a decent piece of hardware for enterprise customers, niche markets (lawyers, healthcare, etc.), and customers craving a physical keyboard in a modern phone. At first glance the BlackBerry KEYone hits all the key marks of a competitive Android smartphone... except for its $549 price tag. The KEYone is expected to launch in April.
No scroll ball? Blasphemy! (hehe)
Unfortunately, that $549 price is not a typo, and it is what kills it even for a CrackBerry addict like myself. After some reflection and discussion with our intrepid smartphone guru Sebastian, I feel as though BlackBerry would have a competitive smartphone on its hands at $399, but at $549 even business IT departments are going to balk, much less consumers (especially as many businesses embrace BYOD culture or have grown accustomed to pricing out and giving everyone whatever basic Android phone or iPhone fits the budget).
While similarly specced Snapdragon 625 smartphones are going for around $300 (e.g. the ASUS ZenFone 3 at $265.98), there is some precedent for higher-priced MSM8953-based smartphones, such as the $449 Moto Z Play. There is some inherent cost in integrating a physical keyboard, and BlackBerry has also hardened the Android 7.1.1 OS, which I can see them charging a premium for and which business customers (or anyone who does a lot of writing on the go and values security) can appreciate. It seems like BlackBerry (and hardware partner TCL) has finally learned how to compete on hardware design in this modern Android-dominated market; now they must learn how to compete on price, especially as more and more Americans buy unlocked, off-contract smartphones! I think the KEYone is a refreshing bit of hardware to come out of BlackBerry (I was not a fan of the Priv design), and I would like to see it do well and give the major players (Apple, Samsung, LG, ASUS, Huawei, etc.) some healthy competition with its twist of a focus on better security, but for that to happen I think the BlackBerry KEYone needs to be a bit cheaper.
What are your thoughts on the KEYone and the return of the physical keyboard? Am I onto something or simply off my Moto Rokr on this?
Subject: General Tech, Graphics Cards | February 27, 2017 - 03:39 PM | Jeremy Hellstrom
Tagged: MWC, GDC, VRMark, Servermark, OptoFidelity, cyan room, benchmark
Futuremark is showing off new benchmarks at GDC and MWC, two conferences that are both happening this week. We will have quite a bit of coverage as we try to keep up with simultaneous news releases and presentations.
First up is a new benchmark in the recently released DX12 VRMark suite: the new Cyan Room, which sits between the existing two tests. The Orange Room tests whether your system is capable of providing an acceptable VR experience or falls somewhat short of the minimum requirements, while the Blue Room shows off what a system that exceeds the recommended specs can manage. The Cyan Room will be for those who know their system can handle most VR and need to test their system's settings. If you don't have the test suite, Humble Bundle has a great deal on this suite and several other tools, if you act quickly.
Next up is a new suite to test Google Daydream, Google Cardboard, and Samsung Gear VR performance and ability. There is more than just performance to test when you are using your phone to view VR content, such as avoiding setting your eyeholes on fire. The tests will help you determine just how long your device can run VR content before overheating becomes an issue and interferes with performance, as well as helping you determine your battery life.
VR latency testing is next on the list of announcements, and it is very important when it comes to VR, as high or unstable latency is the reason some users need to add a bucket to their list of VR essentials. Futuremark has partnered with OptoFidelity to produce the VR Multimeter, a hardware-based HMD test. This allows you, and hopefully soon PCPer as well, to test motion-to-photon latency, display persistence, and frame jitter, as well as audio-to-video synchronization and motion-to-audio latency, all of which could lead to a bad time.
Last up is the brand new Servermark to test the performance you can expect out of virtual servers, media servers and other common tasks. The VDI test lets you determine if a virtual machine has been provisioned at a level commensurate to the assigned task, so you can adjust it as required. The Media Transcode portion lets you determine the maximum number of concurrent streams as well as the maximum quality of those streams which your server can handle, very nice for those hosting media for an audience.
Expect to hear more as we see the new benchmarks in action.
Subject: General Tech, Mobile | February 27, 2017 - 11:12 AM | Sebastian Peak
Tagged: x50, Sub-6 Ghz, qualcomm, OFDM, NR, New Radio, MWC, multi-mode, modem, mmWave, LTE, 5G, 3GPP
Qualcomm has announced their first successful 5G New Radio (NR) connection using their prototype sub-6 GHz system. This announcement was followed by today's news of Qualcomm's collaboration with Ericsson and Vodafone to trial 5G NR in the second half of 2017, as we approach the realization of 5G. New Radio is expected to become the standard for 5G going forward as 3GPP moves to finalize the specification with Release 15.
"5G NR will make the best use of a wide range of spectrum bands, and utilizing spectrum bands below 6 GHz is critical for achieving ubiquitous coverage and capacity to address the large number of envisioned 5G use cases. Qualcomm Technologies’ sub-6 GHz 5G NR prototype, which was announced and first showcased in June 2016, consists of both base stations and user equipment (UE) and serves as a testbed for verifying 5G NR capabilities in bands below 6 GHz."
The Qualcomm Sub-6 GHz 5G NR prototype (Image credit: Qualcomm)
Qualcomm first showed their sub-6 GHz prototype this past summer, and it will be on display this week at MWC. The company states that the system is designed to demonstrate how 5G NR "can be utilized to efficiently achieve multi-gigabit-per-second data rates at significantly lower latency than today’s 4G LTE networks". New Radio, or NR, is a complex topic, as it relates to a new OFDM-based wireless standard. OFDM refers to "a digital multi-carrier modulation method" in which "a large number of closely spaced orthogonal sub-carrier signals are used to carry data on several parallel data streams or channels". With 3GPP adopting this standard going forward, the "NR" name could stick, just as "LTE" (Long Term Evolution) caught on to describe the 4G wireless standard.
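To make the orthogonal sub-carrier idea concrete, here is a minimal textbook OFDM modulator and demodulator sketch in Python. The subcarrier count, QPSK mapping, and cyclic-prefix length are illustrative assumptions, not the actual 5G NR numerology:

```python
import numpy as np

N = 64        # number of subcarriers (illustrative, not a 5G NR figure)
cp_len = 16   # cyclic prefix length (illustrative)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N)

# QPSK: map each pair of bits to one complex symbol per subcarrier.
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

# The inverse FFT turns the parallel frequency-domain symbols into one
# time-domain OFDM symbol; orthogonality comes from the FFT bin spacing.
time_signal = np.fft.ifft(symbols)

# A cyclic prefix (a copy of the symbol's tail) guards against multipath.
tx = np.concatenate([time_signal[-cp_len:], time_signal])

# Receiver: strip the prefix and FFT back to recover every subcarrier.
rx_symbols = np.fft.fft(tx[cp_len:])
assert np.allclose(rx_symbols, symbols)
```

Real NR layers channel coding, pilots, and flexible numerology on top of this, but the IFFT/FFT pair is the heart of any OFDM system.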
Along with this 5G NR news comes the announcement of the expansion of its X50 modem family, first announced in October, "to include 5G New Radio (NR) multi-mode chipset solutions compliant with the 3GPP-based 5G NR global system", according to Qualcomm. This 'multi-mode' solution provides full 4G/5G compatibility with "2G/3G/4G/5G functionality in a single chip", with the first commercial devices expected in 2019.
"The new members of the Snapdragon X50 5G modem family are designed to support multi-mode 2G/3G/4G/5G functionality in a single chip, providing simultaneous connectivity across both 4G and 5G networks for robust mobility performance. The single chip solution also supports integrated Gigabit LTE capability, which has been pioneered by Qualcomm Technologies, and is an essential pillar for the 5G mobile experience as the high-speed coverage layer that co-exists and interworks with nascent 5G networks. This set of advanced multimode capabilities is designed to provide seamless Gigabit connectivity – a key requirement for next generation, premium smartphones and mobile computing devices."
Full press releases after the break.
Subject: Mobile | February 25, 2016 - 11:43 AM | Ryan Shrout
Tagged: MWC, MWC 2016, Samsung, galaxy, s7, s7 edge, qualcomm, snapdragon, snapdragon 820
I got to spend some time with the brand new Samsung Galaxy S7 and S7 Edge phones at MWC this week in Barcelona. Is this your next Android flagship phone?
Subject: Systems, Mobile | February 25, 2016 - 11:42 AM | Ryan Shrout
Tagged: MWC, MWC 2016, Huawei, matebook, Intel, core m, Skylake, 2-in-1
Huawei is getting into the PC business with the MateBook 2-in-1, built in the same vein as the Microsoft Surface Pro 4. Can they make a splash with impressive hardware and Intel Core m processors?
Subject: Graphics Cards, Mobile, Shows and Expos | February 23, 2016 - 08:46 PM | Scott Michaud
Tagged: raytracing, ray tracing, PowerVR, mwc 16, MWC, Imagination Technologies
For the last couple of years, Imagination Technologies has been pushing hardware-accelerated ray tracing. One of the major problems in computer graphics is knowing what geometry and material corresponds to a specific pixel on the screen. Several methods exist, although typical GPUs crush a 3D scene into the virtual camera's 2D space and do a point-in-triangle test on it. Once they know where in the triangle the pixel is (if it is in the triangle at all), it can be colored by a pixel shader.
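As a sketch of that point-in-triangle test, here is the common 2D edge-function form a rasterizer relies on; the triangle and sample points below are invented for illustration:

```python
# The pixel is inside the projected triangle if it lies on the same side
# of all three edges; the signed areas that fall out of the test are what
# a GPU normalizes into barycentric weights for attribute interpolation.

def edge(ax, ay, bx, by, px, py):
    # Signed area of the parallelogram spanned by A->B and A->P;
    # the sign says which side of edge AB the point P is on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def point_in_triangle(tri, px, py):
    (ax, ay), (bx, by), (cx, cy) = tri
    w0 = edge(bx, by, cx, cy, px, py)
    w1 = edge(cx, cy, ax, ay, px, py)
    w2 = edge(ax, ay, bx, by, px, py)
    # All non-negative (or all non-positive, for the opposite winding)
    # means the point is inside the triangle.
    return (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
           (w0 <= 0 and w1 <= 0 and w2 <= 0)

tri = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
print(point_in_triangle(tri, 1.0, 1.0))   # True  (inside)
print(point_in_triangle(tri, 5.0, 5.0))   # False (outside)
```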
Another method is casting light rays into the scene, and assigning a color based on the material that it lands on. This is ray tracing, and it has a few advantages. First, it is much easier to handle reflections, transparency, shadows, and other effects where information is required beyond what the affected geometry and its material provides. There are usually ways around this, without resorting to ray tracing, but they each have their own trade-offs. Second, it can be more efficient for certain data sets. Rasterization, since it's based around a “where in a triangle is this point” algorithm, needs geometry to be made up of polygons.
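For comparison, here is a minimal sketch of a ray-triangle intersection using the well-known Möller–Trumbore algorithm, the kind of test that dedicated ray tracing circuitry accelerates; the ray and triangle values are invented for illustration:

```python
def ray_triangle(orig, direc, v0, v1, v2, eps=1e-9):
    # Möller–Trumbore: solve for the ray distance t and the barycentric
    # coordinates (u, v) of the hit point in one pass.
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    def cross(a, b):
        return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direc, e2)
    det = dot(e1, p)
    if abs(det) < eps:           # ray is parallel to the triangle plane
        return None
    t_vec = sub(orig, v0)
    u = dot(t_vec, p) / det
    if u < 0 or u > 1:
        return None
    q = cross(t_vec, e1)
    v = dot(direc, q) / det
    if v < 0 or u + v > 1:
        return None
    t = dot(e2, q) / det         # distance along the ray to the hit
    return t if t > eps else None

# A ray fired straight down +z hits a triangle sitting in the z = 5 plane.
hit = ray_triangle((0.25, 0.25, 0.0), (0.0, 0.0, 1.0),
                   (0.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.0, 5.0))
print(hit)  # 5.0
```

The per-ray cost of doing this against every triangle is why real implementations pair it with an acceleration structure, another thing ray tracing hardware helps traverse.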
It also has the appeal of being what the real world sort-of does (assuming we don't need to model Gaussian beams). That doesn't necessarily mean anything, though.
At Mobile World Congress, Imagination Technologies once again showed off their ray tracing hardware, embodied in the PowerVR GR6500 GPU. This graphics processor has dedicated circuitry to calculate rays, and they use it in a couple of different ways. They presented several demos that modified Unity 5 to take advantage of their ray tracing hardware. One particularly interesting one was their quick, seven second video that added ray traced reflections atop an otherwise rasterized scene.
It was a little too smooth, creating reflections that were too glossy, but that could probably be downplayed in the material (Update: Feb 24th @ 5pm: Car paint is actually that glossy. It's a different issue). Back when I was working on a GPU-accelerated software renderer, before Mantle, Vulkan, and DirectX 12, I was hoping to use OpenCL-based ray traced highlights on idle GPUs, if I didn't have any other purposes for it. Now though, those can be exposed to graphics APIs directly, so they might not be so idle.
The downside of dedicated ray tracing hardware is that, well, the die area could have been used for something else. Extra shaders, for compute, vertex, and material effects, might be more useful in the real world... or maybe not. Add in the fact that fixed-function circuitry already exists for rasterization, and it makes you balance gain for cost.
It could be cool, but it has its trade-offs, like anything else.
Subject: Mobile | February 23, 2016 - 08:14 AM | Ryan Shrout
Tagged: snapdragon 820, Samsung, s7, qualcomm, MWC 2016, MWC, galaxy
No one is more excited to see the Snapdragon 820 processor in the Galaxy S7 (in some regions) than Qualcomm and Qualcomm's investors. Missing the S6 design win completely was a big blow to the SD 810 but the move to FinFET technology and a new SoC design have put the SD 820 back in the driver's seat for flagship smartphones it seems. While talking with Qualcomm's Peter Carson, Senior Director of Marketing and Modems, I learned quite a bit about the X12 LTE modem integration with the Galaxy S7 as well. As it turns out, the application processor itself isn't the only thing that has impressed OEMs or that will benefit consumers.
Modem marketers have a problem - quantifying the advantages of one LTE modem over another can be troublesome and complex. It's not as simple as X% faster or X% longer battery life, though those aspects of performance see improvement with better modem technology. And while of course the new announcement of Gigabit LTE is getting all the media attention at Mobile World Congress this week, there is a lot of excitement internally about the shipping implementation of the S7's modem.
The Galaxy S7 encompasses the most advanced Qualcomm TruSignal antenna technology implementation to date, combining several features to add real-world benefits to the cellular performance of the device.
First, the S7 will feature the most advanced version of the antenna tuner including a closed loop feedback cycle that will tweak antenna properties in real time based on sensor data and current signal properties. If the proximity sensor is activated or you are rotating or moving the mobile device, the receiver can adjust antenna properties to improve signal reliability measurably.
The best examples fall on the cell edge, where dropped calls are common and voice quality is low. You can improve the gain of the antenna, which is adversely affected by simply holding the device, for much better reliability and even better data throughput. That means fewer dropped calls and network drops for users who have moderate service reliability. Voice quality will get better as well, as the error rates that cause data loss in low-signal areas will be reduced.
But even users with a good signal can benefit from the tech: gains of just 2-3 dB will allow the modem and receiver to go into a lower power state, reducing modem power draw by 20%. That won't equate to a 20% total system battery life improvement, but users who depend on their phones for extended use will see benefits from this integration.
Another TruSignal feature included in this modem implementation is smart transmit antenna switching. The simple explanation is that the modem can swap which antennas are in receive and transmit modes in order to improve transmit (upload) performance by as much as 10 dB! Based on properties of the antenna signal, the position of the device, and whether you are in a heavy upload workload (posting photos to Facebook, a video to YouTube), TruSignal allows the modem to switch in real time.
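Because decibels are logarithmic, the figures quoted here are larger than they may sound. A quick sketch of the standard dB-to-power-ratio conversion (generic math, nothing Qualcomm-specific):

```python
def db_to_power_ratio(db):
    # Standard definition: a power ratio R corresponds to 10 * log10(R) dB.
    return 10 ** (db / 10)

# A 2-3 dB antenna gain is roughly a 1.6-2x power improvement...
print(round(db_to_power_ratio(3), 2))   # ~2.0
# ...and a 10 dB transmit improvement is a full 10x power ratio.
print(db_to_power_ratio(10))            # 10.0
```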
These techniques are additive, so Galaxy S7 owners will find that both the antenna tuner and antenna switching move cellular performance forward, though Qualcomm isn't saying if ALL implementations of Samsung's new flagship smartphone will include the features. I would guess that we'll see this on the Snapdragon 820 + X12 powered models only. It turns out the versions of the S7 that utilize the Samsung Exynos SoC use a non-Qualcomm modem, so they will not support the features seen here.
Subject: Processors, Mobile | February 22, 2016 - 11:11 AM | Sebastian Peak
Tagged: TSMC, SoC, octa-core, MWC 2016, MWC, mediatek, Mali-T880, LPDDR4X, Cortex-A53, big.little, arm
MediaTek might not be well-known in the United States, but the company has been working to expand from China, where it had a 40% market share as of June 2015, into the global market. While 2015 saw the introduction of the 8-core Helio P10 and the 10-core Helio X20 SoCs, the company continues to expand their lineup, today announcing the Helio P20 SoC.
There are a number of differences between the recent SoCs from MediaTek, beginning with the CPU core configuration. This new Helio P20 is a “True Octa-Core” design, but rather than a big.LITTLE configuration it’s using 8 identically-clocked ARM Cortex-A53 cores at 2.3 GHz. The previous Helio P10 used a similar CPU configuration, though clocks were limited to 2.0 GHz with that SoC. Conversely, the 10-core Helio X20 uses a tri-cluster configuration, with 2x ARM Cortex-A72 cores running at 2.5 GHz, along with a typical big.LITTLE arrangement (4x Cortex-A53 cores at 2.0 Ghz and 4x Cortex-A53 cores at 1.4 GHz).
Another change affecting MediaTek’s new SoC and the industry at large is the move to smaller process nodes. The Helio P10 was built on 28 nm HPM, and this new P20 moves to 16 nm FinFET. Just as with the Helio P10 and the Helio X20 (a 20 nm part), this SoC is produced at TSMC, in this case using their 16FF+ (FinFET Plus) technology. This should provide up to “40% higher speed and 60% power saving” compared to TSMC's previous 20 nm process found in the Helio X20, though of course real-world results will have to wait until handsets are available to test.
The Helio P20 also takes advantage of LPDDR4X, and is “the world’s first SoC to support low power double data rate random access memory” according to MediaTek. The company says this new memory provides “70 percent more bandwidth than the LPDDR3 and 50 percent power savings by lowering supply voltage to 0.6v”. Graphics are powered by ARM’s high-end Mali-T880 GPU, clocked at an impressive 900 MHz. And all-important modem connectivity includes Cat 6 LTE with 2x carrier aggregation for speeds of up to 300 Mbps down and 50 Mbps up. The Helio P20 also supports up to 4K/30 video decode with H.264/H.265 support, and the 12-bit dual camera ISP supports up to 24 MP sensors.
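As a back-of-the-envelope check on that voltage claim: dynamic CMOS power scales roughly with the square of the supply voltage (P ≈ C·V²·f). The 1.2 V LPDDR3 I/O rail below is a typical value assumed for illustration:

```python
v_lpddr3 = 1.2    # typical LPDDR3 I/O voltage (assumed for illustration)
v_lpddr4x = 0.6   # reduced LPDDR4X I/O voltage quoted by MediaTek

# Dynamic power scales with V^2 at fixed capacitance and frequency.
io_power_ratio = (v_lpddr4x / v_lpddr3) ** 2
print(io_power_ratio)  # 0.25, i.e. ~75% less I/O power

# Only the I/O interface runs at the reduced rail, so a ~50% saving for
# the memory subsystem as a whole is in a plausible range.
```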
Specs from MediaTek:
- Process: 16nm
- Apps CPU: 8x Cortex-A53, up to 2.3GHz
- Memory: Up to 2x LPDDR4X 1600MHz (up to 6GB) + 1x LPDDR3 933MHz (up to 4GB) + eMMC 5.1
- Camera: Up to 24MP at 24FPS w/ ZSD, 12-bit dual ISP, 3A HW engine, Bayer & mono sensor support
- Video Decode: Up to 4Kx2K 30fps H.264/H.265
- Video Encode: Up to 4Kx2K 30fps H.264
- Graphics: Mali-T880 MP2 900MHz
- Display: FHD 1920x1080 60fps, 2x DSI for dual display
- Modem: LTE FDD/TDD R.11 Cat.6 with 2x20 CA, C2K SRLTE, L+W DSDS support
- Connectivity: Wi-Fi 802.11ac/a/b/g/n (with MT6630), GPS/GLONASS/BeiDou/BT/FM
- Audio: 110dB SNR & -95dB THD
It’s interesting to see SoC makers experiment with less complex CPU designs after a generation of multi-cluster (big.LITTLE) SoCs, as even the current flagship Qualcomm SoC, the Snapdragon 820, has reverted to a straight quad-core design. The P20 is expected to be in shipping devices by the second half of 2016, and we will see how this configuration performs once some devices using this new P20 SoC are in the wild.
Full press release after the break:
Subject: Mobile, Shows and Expos | February 22, 2016 - 05:09 AM | Ryan Shrout
Tagged: video, snapdragon 820, snapdragon, qualcomm, MWC 2016, MWC, LG, G5
The new LG G5 flagship smartphone offers a unique combination of form factor, performance and modularity that no previous smartphone design has had. But will you want to buy in?
I had a feeling that the Snapdragon 820 SoC from Qualcomm would make an impression at Mobile World Congress this year and it appears the company has improved on the previous flagship processor quite a bit. Both Samsung and LG have implemented it into the 2016 models, including the new G5, offering up a combination of performance and power efficiency that is dramatically better than the 810 that was hindered by heat and process technology concerns.
Along with the new processor, the G5 includes 4GB of RAM, 32GB of on-board storage with micro SD expansion, a 2,800 mAh battery and Android 6.0 out of the box. The display is 5.3-in and uses LG IPS technology with a 2560x1440 resolution, resulting in an impressive 554 PPI. LG has updated the USB connection to Type-C, a move that Samsung brushed off as unnecessary at this time.
The phone's design is pretty standard and will look very familiar to anyone who has handled a G4 or similar flagship smartphone in recent months. It was bigger in the hand than the iPhone 6s, but considering the panel size differences, it was more compact than expected.
Modularity is the truly unique addition to the G5 though. The battery is replaceable by sliding out a bottom portion of the phone, released with a tab on the left side. This allows LG to maintain the metal body construction but still offer flexibility for power users that are used to having extra batteries in their bag. This mechanism also means LG can offer add-on modules for the phone.
The first two modules available will be the LG Cam Plus and the LG Hi-Fi Plus. The Cam Plus gives the phone a camera grip as well as dedicated buttons for the shutter, video recording, and zoom; the extra 1,200 mAh of battery it includes is a nice touch too. The Hi-Fi Plus module has a DAC and headphone amplifier embedded in it and can also be connected to a PC through the USB Type-C connection.
I was overall pretty impressed with what LG had to offer with the G5. Whether or not the modular design gains any traction will have to be seen; I have concerns over the public's desire to carry around modules or affect the form factor of their phones so dramatically.