Subject: Processors, Mobile | December 1, 2015 - 07:30 AM | Scott Michaud
Tagged: TSMC, SoC, LG, Intel, arm
So this story came out of nowhere. Whether the rumors are true or false, I am stuck on how everyone seems to be talking about it with a casual deadpan. I spent a couple of hours Googling whether I missed some big announcement that made Intel potentially fabricating ARM chips a mundane non-story. Pretty much all I found was Intel allowing Altera to make FPGAs with embedded ARM processors in a supporting role, which is old news.
Image Credit: Internet Memes...
The rumor is that Intel and TSMC were both vying to produce LG's Nuclun 2 SoC. This part is said to house two quad-core ARM modules in a typical big.LITTLE formation. Samples were allegedly produced, with Intel's part (2.4 GHz) able to clock around 300 MHz faster than TSMC's offering (2.1 GHz). Clock rate is highly dependent upon the “silicon lottery,” so this is an area that production maturity can help with. Intel's sample would also be manufactured at 14nm (versus 16nm from TSMC, although these numbers mean less than they used to). LG was also, again allegedly, interested in Intel's LTE modem. According to the rumors, LG went with TSMC because they felt Intel couldn't keep up with demand.
Now that the rumor has been reported... let's step back a bit.
I talked with Josh a couple of days ago about this post. He's quite skeptical (as I am) about the whole situation. First and foremost, it takes quite a bit of effort to port a design to a different manufacturing process. LG could do it, but it seems questionable, especially for what would only be the company's second chip ever. Moreover, I still believe that Intel doesn't want to manufacture chips that directly compete with its own. x86 in phones is still not a viable business, but Intel hasn't given up, and you would think giving up would be a prerequisite.
So this whole thing doesn't seem right.
Subject: Displays | November 28, 2015 - 05:27 PM | Scott Michaud
Tagged: LG, lg display, oled
LG Display announced that they are investing $1.6 Billion USD to build an OLED panel factory in Paju, South Korea. This initial cost will cover the building, the “foundations” of the clean rooms, and basic infrastructure such as water and power. Construction will begin immediately. The plant is expected to cost $8.7 Billion USD by the time it starts producing displays, which the company anticipates for early 2018. It will produce panels for smart watches, cars, and even large TVs.
The shift from LCD to OLED has been anticipated for a while, but it seems like the former technology just kept remaining viable. It stayed ahead of plasma, despite LCD being considered inferior in terms of contrast and maintainability by some, and outlived it. SED threatened to crush it, but never really became available because Canon basically misunderstood patent licensing terms from a Texas-based nanotech company. Mobile devices helped push LCD panels away from TN technology and into IPS-like panels, which closed the gap between LCD and early OLED.
LCD will eventually hit its ceiling, though, and heightened availability of OLED could be what pushes past it. Hopefully the technology makes it to consumer desktop panels relatively soon. Display manufacturers have been experimenting with higher refresh rates, better panel quality, and higher resolutions recently, but adding OLED to the mix should push the industry toward focusing even more heavily on contrast and color reproduction.
Subject: Mobile | November 17, 2015 - 06:01 PM | Scott Michaud
Tagged: LG, tracfone, walmart
Don't expect much.
This $9.82 phone runs Android 4.4 KitKat with a 1.2 GHz, dual-core processor, which is backed by 512 MB of RAM. It has 4GB of internal storage, which LG advertises as having “up to 1.15 GB usable”. It is also listed as having about 7 hours of talk time, with almost 10 days of standby (although that is probably with next to nothing running). These components power a phone with a 3.8-inch, 480x320 display. It is not compatible with LTE, but it does have WiFi and 3G.
That said, the person writing this article is currently using an LG Optimus One from 2010, which runs Android 2.2 and doesn't even have enough on-device storage to install and use Firefox for Android. (My phone has ~60MB usable with basically nothing installed and a couple of built-in apps uninstalled.) So, for someone like me, this phone would actually be a step up and usable for something more than just phone calls.
... not much more, but maybe $10 worth of more?
Subject: Mobile | October 2, 2015 - 02:02 AM | Tim Verry
Tagged: LG, ultrathin, Broadwell, ips display
Earlier this week, LG revealed three new notebooks under its Gram series that are set to compete with Apple’s Macbook Air (The Verge has a photo comparison of the two) and various Ultrabooks from other manufacturers (e.g. Lenovo and Asus). The new series includes one 13-inch and two 14-inch laptops that weigh in at 2.16 pounds and are 0.5” thick. The LG Gram with 13” display is the smallest of the bunch at 11.9” x 8.4” x 0.5” and the chassis is constructed of magnesium and polycarbonate (plastic). Meanwhile, the two notebooks with the 14” display measure 12.8” x 8.94” x 0.5” and feature a body made from a combination of carbon-magnesium and lithium-magnesium alloys. The difference in materials accounts for the larger notebooks hitting the same weight target (2.16 lbs).
The 14-inch LG Gram 14 (gram-14Z950-A.AA4GU1) notebook.
LG is packing every Gram notebook with a 1080p IPS display (13.3 or 14 inches), dual mics, a 1.3 MP webcam, a six-row island-style keyboard, and a spacious track pad. External I/O includes two USB 3.0 ports, HDMI output, a micro SD card slot, and a micro USB port that (along with the included dongle) supports a 10/100 Ethernet connection.
The base Gram 13-inch comes in Snow White while both Gram 14-inch notebooks are clad in Champagne Gold.
The LG Gram 13 Broadwell-powered laptop (gram-13Z950-A.AA3WU1).
Internally, LG has opted to go with Intel’s Broadwell processor and its built-in HD 5500 GPU. The LG Gram 13 uses the Intel Core i5-5200U (2 cores, 4 threads at 2.2-2.7GHz). The 14-inch models can be configured with an Intel i5 or an Intel Core i7-5500U which is a dual core (with HyperThreading for four threads) processor clocked at 2.4 GHz that can boost to 3.0 GHz. Additional specifications include 8GB of DDR3L memory, a solid state drive (128 GB on the Gram 13, up to 256 GB on the Gram 14), Intel 802.11ac Wi-Fi, and rated battery life of up to 7.5 hours (which is not great, but not too bad).
The Gram series is LG’s first major thin-and-light entry into the US market, and while there are some compromises made to get the portability, the price points seem competitive. Interestingly, LG is positioning these notebooks as MacBook Air competitors, allegedly offering you a larger, yet lighter, notebook. It is not actually the lightest notebook on the market, however. Below is a brief weight comparison against some of the major recent thin-and-lights the Gram is going up against:
- 12” Apple MacBook: 2.03 lbs
- 11” Apple MacBook Air: 2.38 lbs
- 13” Apple MacBook Air: 2.96 lbs
- 13.3" ASUS Zenbook UX305FA (Core M): 2.65 lbs
- 13.3" ASUS Zenbook UX301LA (Core i7): 3.08 lbs
- 13.3” LaVie Z: 1.87 lbs
- 13.3” LaVie Z 360: 2.04 lbs
- 12.2" Samsung ATIV Book 9: 2.09 lbs
We will have to wait for reviews to see how the build quality stacks up, especially on the 14-inch models using the lithium-magnesium bodies, which, while light, may not be the sturdiest flex-wise. If they can hold up to the stress of daily commuting, the retail pricing is far from exorbitant and, if you can live with the compromises, fairly attractive.
Subject: Displays | June 9, 2015 - 01:51 AM | Sebastian Peak
Tagged: UHD, LG, ips monitor, gaming monitor, freesync, amd, 4k, 27MU67-B
LG announced a new 4K monitor today, and since it's from LG you know there has to be an IPS panel inside.
The 27MU67-B boasts a 3840x2160 UHD/4K IPS panel and supports AMD FreeSync variable refresh rate technology, though the panel appears to only support up to 60 Hz according to the official specs. Speaking of, here's the full rundown:
- Panel Type: IPS
- Color Gamut (CIE1931): sRGB 99%
- Aspect Ratio: 16:9
- Resolution: 3840x2160
- Brightness (cd/m2): 300 cd/m2
- Contrast Ratio: 5M:1
- Response Time (GTG): 5ms
- Refresh Rate: 60 Hz
- Viewing Angle: 178 / 178
- Surface Treatment: Hard Coating (3H), anti-glare
- DVI-D x1
- HDMI x2
- Display Port x1
- Black Stabilizer: Black Equalizer
- DAS Mode: Yes
- Reader Mode: Yes
- PC: Yes
- DDC/CI: Yes
- HDCP: Yes (2.2)
- FreeSync: Yes (w/ DP, mDP)
- Factory Calibration: Yes
- Super+ Resolution: Yes
- Screen-split: Yes (Software)
- Flicker Safe: Yes
- Pivot: Yes
- Dual Controller: Yes (Software)
The 27MU67-B also features factory calibration and 99% sRGB coverage, so the display could be used for more critical work (yes, gaming can be categorized as "critical").
The LG 27MU67-B has an MSRP of $599.99 and availability is listed as “coming soon”.
LG Australia published a product page for their LG 27MU67 monitor, which the rest of the company doesn't seem to acknowledge. It has also stayed online for three days now, plenty of time for someone to pull the plug if it were a mistake. This one is interesting for a variety of reasons: it's 4K, it's IPS, and it supports AMD FreeSync. It is also relatively cheap for that combination, being listed at $799 AUD RRP.
Some websites have converted that to ~$610 to $620 USD, but it might even be less than that. Australian prices are often listed with the federal GST rolled in, which inflates the sticker price by about 10%. It is possible, though maybe wishful thinking, that this monitor could retail in the ~$500 to $550 price range for the United States (if it even comes to North America). Again, this is a 4K, IPS, FreeSync panel.
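To see where those estimates come from, here is a minimal back-of-the-envelope sketch. The 10% GST figure is the Australian federal sales tax; the exchange rate is an assumed mid-2015 value (roughly 0.77 USD per AUD) used purely for illustration.

```python
# Back-of-the-envelope conversion of the Australian RRP to a US-style price.
# The GST rate is real; the exchange rate is an assumption for illustration.

AUD_RRP = 799.00       # Australian recommended retail price, GST included
GST_RATE = 0.10        # Australian goods and services tax
AUD_TO_USD = 0.77      # assumed mid-2015 exchange rate

price_ex_gst_aud = AUD_RRP / (1 + GST_RATE)   # strip the tax that US MSRPs omit
estimated_usd = price_ex_gst_aud * AUD_TO_USD

print(f"Ex-GST price: {price_ex_gst_aud:.2f} AUD")      # ~726 AUD
print(f"Rough US equivalent: {estimated_usd:.2f} USD")  # ~559 USD
```

That lands between the naive ~$610 conversion and the hopeful $500-to-$550 range, which is why the lower figure isn't completely unreasonable.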
Very little is posted on LG's website, so it is hard to tell how good of an IPS panel this is. It is listed as 99% sRGB coverage, which is good for typical video but not the best if you are working on printed content, such as magazine illustrations. On the other hand, this is a gaming panel, not a professional one. Update (May 29, 2015): It also has 10-bit (per channel) color. It sounds like it is true 10-bit, not just a look-up table, but I should note that it doesn't explicitly say that.
Again, pricing and availability is up in the air, because this is not an official announcement. It is listed to launch in Australia for $799 AUD, though.
Subject: General Tech | April 30, 2015 - 01:15 PM | Jeremy Hellstrom
Tagged: snapdragon 810, qualcomm, LG, Samsung
There have been many stories about Qualcomm's difficulties lately, from the court case with NVIDIA to Samsung and LG not using the Snapdragon 810 for their new smartphones. Qualcomm has struck back at the speculation about problems with this chip that arose from those decisions, pointing out that Microsoft, Xiaomi, Motorola, and Sony will all be releasing devices with the Snapdragon 810 in the near future. LG put in their two cents as well, pointing out that their decision to use the 808 was made over a year ago, that they still plan on utilizing the next-generation Snapdragon 820 in the future, and that they already use the 810 in their G Flex 2. Samsung has also shown their belief in Qualcomm's products, considering they will be fabbing the 820. You can see a short video of an interview with Qualcomm about this topic over at The Register.
"QUALCOMM HAS DEBUNKED chatter that LG ditched its octa-core Snapdragon 810 chip for the G4 owing to overheating problems."
Here is some more Tech News from around the web:
- Windows 10 is now available for the Raspberry Pi 2 @ The Inquirer
- 'Android on Windows': Microsoft tightens noose around neck, climbs on chair @ The Register
- This is Spartan? No, it's Microsoft Edge, Son of Internet Explorer @ The Register
- IBM creates quantum super-conductor in square formation @ The Inquirer
- Disney Replaces Longtime IT Staff With H-1B Workers @ Slashdot
- Using Asus Transfer Express: A Multi-Platform Control Hub @ Kitguru
Subject: Mobile | April 3, 2015 - 08:00 AM | Tim Verry
Tagged: snapdragon 801, smartphone, quad hd, LG, Android 5.0
Just Delivered is a section of PC Perspective where we share some of the goodies that pass through our labs that may or may not see a review, but are pretty cool none the less.
Find the LG G3 on Amazon!
- LG G3 32GB Unlocked US Version (T-Mobile) - $540
- LG G3 16GB Unlocked International Version - $380
- LG G3 32GB Unlocked International Version - $435
Last week I stopped by the T-Mobile store in the mall, handed over two old phones, and ported over two lines from Verizon. I walked out with a cheaper contract with unlimited data (versus 4GB on Verizon) and a shiny new (to me, it's been out for a while) LG G3. Which brings me to this post.
First off, the LG G3 is huge. This is the smallest tablet... er, largest smartphone I have ever owned. Measuring 146.3 x 74.6 x 8.9 mm, the 149g smartphone is slightly smaller than the Apple iPhone 6 Plus and a bit chunkier at its thickest point. It is, however, easier to hold and operate (especially one-handed) than the iDevice. The front is dominated by a large 5.5-inch Quad HD IPS display (2560 x 1440 resolution), and the phone features rounded edges and a curved back. I chose the white version, but it also comes in black, blue, gold, red, and purple (the international versions). Except for the top bezel that holds the webcam, light sensor, and speaker, and that bit of empty space below the display with the LG logo, the G3 has super thin bezels. In fact, the phone is not much larger than the display (certainly width-wise).
The LG G3's display looks amazing with sharp text and extremely detailed videos (the included 4k content is great). It is highly reflective and I had to crank the brightness all the way up to be able to read it under direct sunlight (my S4 was similar in this respect). In other lighting situations, it worked really well.
An infrared transmitter, microphone, micro USB port, and 3.5mm audio jack are placed along the top and bottom edges of the phone. Like its predecessor (the G2), LG has placed the power and volume buttons on the back of the device rather than the sides (Update: I am generally liking this setup now). The recessed buttons sit beneath the camera lens and are easier to find and use than I expected them to be. Now that I am getting used to them, I think LG is onto something (good) with this button placement. There is also a 1-watt speaker in the lower left corner of the back cover for media playback and speakerphone calls. For a smartphone speaker it can get fairly loud and does what it is supposed to. It is not spectacular but it is also not bad. I mostly use headphones but it's nice to know that I have a decent speaker should I want to share my music.
The curved back cover makes it easy to hold in one hand (even if I can't hit all the on-screen buttons without a longer thumb heh) and I feel like it will be dropped less frequently than my previous phone (the Galaxy S4) as a result of the form factor. One big change with the G3, for me, is the lack of buttons below the display (capacitive or physical), but I am slowly getting used to the on-screen navigation on Android (especially once I figured out I could long press the recent apps button to regain the menu button I miss from my S4).
Aside from the display, the G3 features a 2.1MP front-facing camera and a 13MP rear camera. The rear camera is where things get interesting because it is paired with a dual LED flash, laser focus, and optical image stabilization (OIS) technology. Outdoor shots were excellent, and indoor shots with enough lighting were great. In low light situations, the camera left something to be desired, and I was kind of disappointed. Using the flash does help, and it is quite bright. However, I tend to not like using the flash unless I have to, as photos always look less natural. For as small as the camera is though (the lens and sensor are tiny), it does pretty well. In good lighting conditions it trounces my S4, but the upgrade is much less noticeable with less light (the G3 does have a much brighter flash).
The laser focus is a really cool feature that works as advertised. The camera focuses extremely quickly (even in low light) allowing me a much better chance to capture the moment. It also refocuses (tap to focus) quickly.
The camera software is not as full featured as other smartphones I have used, however. I was put off by this at first as someone that likes to tinker with these things but at the end of the day it does what it is supposed to and it does it well (which is to take photos). You can swipe to switch between the front and back cameras, choose from a couple preset modes, and adjust basic settings like resolution, voice controls, HDR, and shutter timer. For "selfie" fans, LG has a feature where you can make a fist in the air and it will start a countdown timer. While I have not tried the voice commands, I did try the gesture and it does work well.
Anyways, before this turns into a full review (heh), it might help to know what's under the hood as well. The G3 is powered by a Qualcomm Snapdragon 801 SoC which pairs four Krait 400 CPU cores clocked at 2.5 GHz with an Adreno 330 GPU. The phone comes with either 16GB internal storage and 2GB of RAM or 32GB internal storage and 3GB RAM. I chose the higher end model to get the extra RAM just in case as I plan to have this phone for a long time. It supports 4G LTE, 802.11ac Wi-Fi, Bluetooth 4.0, GPS, and NFC (Near Field Communication). You can also use it with Qi-enabled wireless chargers if you purchase a supporting back cover. The G3 is running Android 4.4.2 on T-Mobile but it does support Android 5.0 and some carriers have already pushed out updates.
The G3 comes with a 3,000 mAh battery and a 1.8 amp USB charger. It does take a while to charge this thing (my 2.1 amp Samsung charger is a bit faster), but once it is fully charged it will easily last all day, including listening to streaming music and audiobooks, text messaging, and web browsing. (Update: I don't have specific battery life numbers yet, but I generally only need to charge it once a day so long as I keep the display brightness around half. If I crank the brightness all the way up, I can almost feel the battery draining by the second heh.)
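For a rough sense of why charging takes a while, here is a minimal sketch of the lower bound on charge time. It assumes the charger actually delivers its full rated current and ignores the constant-voltage taper near full and conversion losses, so real-world times will run noticeably longer.

```python
# Lower-bound charge time for the G3's 3,000 mAh battery at a constant
# charge current. Real charging tapers off near full and has conversion
# losses, so actual times will be noticeably longer than these figures.

BATTERY_MAH = 3000

def ideal_charge_hours(charger_amps: float) -> float:
    """Best-case charge time if the full rated current went into the cell."""
    return BATTERY_MAH / (charger_amps * 1000)

for amps in (1.8, 2.1):   # bundled LG charger vs. the 2.1 A Samsung charger
    print(f"{amps} A charger: ~{ideal_charge_hours(amps):.1f} h minimum")
# 1.8 A -> ~1.7 h, 2.1 A -> ~1.4 h
```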
Like Samsung, LG has a battery saving feature that will kick in at 30% to conserve battery by turning down the screen brightness, turning off radios that are not active, and disabling a few other configurable battery drainers (haptic feedback, notification lights, and account syncing). I do like their battery settings page, as it will estimate the time needed to charge and the time remaining as it discharges, along with a nice graph of battery percentage over time. Other Android phones have something similar, but LG has fleshed it out a bit more.
Just for fun, I installed 3DMark and ran the Ice Storm benchmark. The LG G3 maxed out the Ice Storm test and scored 10,033 points in Ice Storm Extreme. Further, it scored 16,151 in Ice Storm Unlimited. In comparison, the (apparently extremely popular, judging by the feedback) Samsung Galaxy Centura scored 536 in Ice Storm and 281 in Ice Storm Extreme (hehe). My Galaxy S4 is no longer available for me to test, but TweakTown was able to get 6,723 in the Ice Storm Extreme test.
LG packs light, with only the smartphone, USB cable, USB charger, and a quick start guide included in the box. No headphones or extra accessories here.
In all, so far so good with the LG G3. I am very happy with my purchase and would recommend checking it out if you are in the market for a large-display smartphone that's not an iPhone 6 Plus or Galaxy Note 4 (which Ryan recently reviewed). If you want the latest and greatest Android phone and can afford the premium (about $300 more in my case when I compared them), grab the Note 4. On the other hand, if you are looking for an Android smartphone with a large display, good battery life, and decent hardware specifications, the LG G3 is a respectable choice that delivers and doesn't break the bank.
Have you tried out the G3? What do you think about the trend for larger and thinner smartphones? This is hardly an exhaustive review and there are things I didn't get into here. After all, I'm still checking out my G3. With that said, from first impressions and about a week of usage it seems like a really solid device. I've since fitted it with a screen protector and a case so as to not break it – especially that hi-res display!
A monitor for those that like it long
It takes a lot to really impress someone that sits in front of dual 2560x1600 30-in IPS screens all day, but the LG 34UM95 did just that. With a 34-in diagonal 3440x1440 resolution panel forming a 21:9 aspect ratio, built on LG IPS technology for flawless viewing angles, this monitor creates a work and gaming experience that is basically unmatched in today's market. Whether you need to open up a half-dozen Excel or Word documents, keep an eye on your Twitter feed while looking at 12 browsers or run games at near Eyefinity/Surround levels without bezels, the LG 34UM95 is a perfect option.
Originally priced north of $1200, the 34UM95 and many in LG's 21:9 lineup have dropped in price considerably, giving them more avenues into users' homes. There are obvious gaming advantages to the 34-in display compared to a pair of 1920x1080 panels (no bezel, 20% more pixels) but if you have a pair of 2560x1440 screens you are going to be giving up a bit. Some games might not handle 21:9 resolutions well either, just as we continue to see Eyefinity/Surround unsupported occasionally.
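As a quick sanity check on those pixel-count comparisons, here is a minimal sketch of the arithmetic:

```python
# Pixel-count comparison behind the "20% more pixels" claim.

def pixels(width: int, height: int, count: int = 1) -> int:
    return width * height * count

ultrawide  = pixels(3440, 1440)        # LG 34UM95
dual_1080p = pixels(1920, 1080, 2)     # a pair of 1920x1080 panels
dual_1440p = pixels(2560, 1440, 2)     # a pair of 2560x1440 panels

print(f"34UM95 vs dual 1080p: {ultrawide / dual_1080p - 1:+.0%}")  # about +19%
print(f"34UM95 vs dual 1440p: {ultrawide / dual_1440p - 1:+.0%}")  # about -33%
```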
Productivity users will immediately see an improvement, both for those of us inundated with spreadsheets, web pages, and text documents as well as the more creative types with Adobe Premiere timelines. I know that Ken would definitely have approved of us keeping this monitor here at the office for his use.
Check out the video above for more thoughts on the LG 34UM95!
What is FreeSync?
FreeSync: What began as merely a term for AMD’s plans to counter NVIDIA’s launch of G-Sync (and a mocking play on NVIDIA’s trade name) has finally come to fruition, keeping the name - and the attitude. As we have discussed, AMD’s Mantle API was crucial to pushing the industry in the correct and necessary direction for lower-level APIs, and NVIDIA’s G-Sync deserves the same credit for recognizing and demonstrating the necessity of a move to variable refresh display technology. Variable refresh displays can fundamentally change the way that PC gaming looks and feels when they are built correctly and implemented with care, and we have seen that time and time again with many different G-Sync enabled monitors at our offices. It might finally be time to make the same claims about FreeSync.
But what exactly is FreeSync? AMD has been discussing it since CES in early 2014, claiming that they would bypass the idea of a custom module that a monitor needs in order to support VRR, and instead go the route of open standards using a modification to DisplayPort 1.2a from VESA. FreeSync is based on Adaptive-Sync, an optional portion of the DisplayPort standard that enables a variable refresh rate by expanding the vBlank timings of a display, and it also provides a way of updating EDID (display ID information) to communicate these capabilities to the graphics card. FreeSync itself is simply the AMD brand for this implementation, combining monitors that implement Adaptive-Sync correctly with drivers and GPUs that support the variable refresh technology.
A set of three new FreeSync monitors from Acer, LG and BenQ.
Fundamentally, FreeSync works in a very similar fashion to G-Sync, utilizing the idea of the vBlank timings of a monitor to change how and when it updates the screen. The vBlank signal is what tells the monitor to begin drawing the next frame, representing the end of the current data set and marking the beginning of a new one. By varying the length of time this vBlank signal is set to, you can force the monitor to wait any amount of time necessary, allowing the GPU to end the vBlank instance exactly when a new frame is done drawing. The result is a variable refresh rate monitor, one that is in tune with the GPU render rate, rather than opposed to it. Why is that important? I wrote in great detail about this previously, and it still applies in this case:
The idea of G-Sync (and FreeSync) is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync (and FreeSync) introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating what rate this refresh occurs at to the PC, the graphics card now tells the monitor when to refresh in a properly configured G-Sync (and FreeSync) setup. This allows a monitor to match the refresh rate of the screen to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
Gamers today are likely to be very familiar with V-Sync, short for vertical sync, which is an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory, this would work well and the image is presented to the gamer without artifacts. The problem is that games that are played and rendered in real time rarely hold a very specific frame rate. With only a couple of exceptions, game frame rates will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, yet V-Sync forces the image to be displayed only at set fractions of the monitor's refresh rate, which causes problems.
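To make that timing difference concrete, here is a deliberately simplified sketch (not any real display API) of the behavior described above: with V-Sync, a finished frame has to wait for the next fixed refresh tick, while a variable refresh display simply holds vBlank until the GPU is done and shows the frame immediately. The frame times used are hypothetical.

```python
# Illustrative comparison of fixed-interval V-Sync vs. a variable refresh
# display that stretches vBlank until the GPU finishes each frame.
import math

REFRESH_HZ = 60
TICK_MS = 1000 / REFRESH_HZ                  # 16.67 ms fixed refresh interval

frame_render_times_ms = [12.0, 18.0, 25.0, 14.0, 33.0]   # hypothetical GPU frame times

def vsync_display_times(frame_times):
    """Each frame is shown at the first fixed refresh tick after it finishes."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(math.ceil(t / TICK_MS) * TICK_MS)    # wait for the next tick
    return shown

def vrr_display_times(frame_times):
    """vBlank is held until the frame is ready, so it is shown immediately."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(t)                                    # no quantization to a tick
    return shown

for label, times in (("V-Sync", vsync_display_times(frame_render_times_ms)),
                     ("VRR   ", vrr_display_times(frame_render_times_ms))):
    print(label, [f"{t:.1f} ms" for t in times])
# V-Sync snaps every frame to a 16.7 ms grid (adding latency and judder);
# VRR presents each frame the moment the GPU finishes rendering it.
```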