Subject: General Tech | August 28, 2015 - 04:40 PM | Jeremy Hellstrom
Tagged: google, chrome, flash, apple
The good news from Google is that as of next month, Flash ads will be 'Click to Play' when you are browsing in Chrome. This will be nice for blocking the moving ads, but even better for defeating those sick-minded advertisers who think audio ads are acceptable. However, this will hurt websites that depend on ad revenue and serve Flash-based ads ... which is to say nearly all of the ones not behind a paywall. The move will also make your web browsing somewhat safer, as it will prevent the drive-by infections which Flash spreads like a plague-infested flea; as long as advertisers switch to HTML5, their ads will still play and revenue will continue to come in.
The news of Chrome's refusal to play Flash ads is tempered somewhat by Google's decision to put advertising ahead of security on Apple devices. The new iOS 9 enforces HTTPS for app connections by default, providing security and making it more difficult for websites to gather personalized data, but as anyone who uses HTTPS Everywhere already knows, not all advertisements are compliant and they are often blocked from displaying entirely. To ensure that advertisers can still reach your iOS 9 device, Google has provided a workaround for Apple's App Transport Security, rendering the protection HTTPS offers inoperative. Again, while sites do depend on advertisements to exist, sacrificing security to display those ads is hard to justify.
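For context, App Transport Security can be switched off app-wide through a documented exception key in an app's Info.plist, and Google's workaround amounts to this kind of opt-out. A minimal sketch (not Google's exact published instructions):

```xml
<!-- Info.plist fragment: opts the entire app out of App Transport Security,
     allowing plain-HTTP requests (e.g. for non-HTTPS ad networks).
     NSAllowsArbitraryLoads is Apple's documented key; setting it this
     broadly forfeits the HTTPS guarantees ATS is meant to enforce. -->
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
```

Apple also supports per-domain exceptions, which would be a far less drastic way to accommodate a specific non-compliant ad server than disabling ATS wholesale.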
"The web giant has set September 1, 2015 as the date from which non-important Flash files will be click-to-play in the browser by default – effectively freezing out "many" Flash ads in the process."
Here is some more Tech News from around the web:
- BitTorrent kills bug that turns networks into a website-slaying weapon @ The Register
- Windows 10 download Build 10532 arrives but Chrome borkage continues @ The Inquirer
- Turning a typewriter into a mechanical keyboard @ Hack a Day
Subject: General Tech | July 17, 2015 - 03:36 PM | Jeremy Hellstrom
Tagged: SIM, Samsung, apple, Vodafone, AT&T, orange, Deutsche Telekom
If you hate trying to read the numbers off of your SIM card and are sick of its continual shrinking, then Apple and Samsung's plan to make the SIM card extinct may be good news. If you have a phone with dual SIMs, or remove the SIM when you travel to ensure no roaming charges are applied, then perhaps you are less than happy to hear these companies want to replace the physical SIM with a software one. It will make changing providers and phones easier, but making it a permanent part of the phone could have some drawbacks. Those of you who have a new iPad Air or iPad Mini may already be familiar with soft SIMs; if you want to read more you can catch up at The Register.
"Smartphone goliaths Apple and Samsung are reportedly confabulating at a high level regarding plans for hardware which would replace SIM cards in mobile devices - this technology would be embedded in phones, tablets etc and would not be exchangeable to different devices."
Here is some more Tech News from around the web:
- Pray for AMD @ The Register
- Windows 10: Microsoft confirms it will offer software on USB flash drives @ The Inquirer
- TSMC says 10nm on schedule @ DigiTimes
- TSMC, Samsung start A9 chip mass production @ DigiTimes
- Windows RT on life support: Microsoft vows it won't pull the plug @ The Register
- ARM servers look to have legs as OVH boots up Cavium cloud @ The Register
Battle of the Sixes, they call it
GameBench is a low-level application released in 2014 that attempts to bring the technical analysis and benchmarking capability of the PC to the mobile device. You might remember that I showed you some early results and discussed our use of GameBench in my Dell Venue 8 7000 review a few months back; my understanding and practice of using the software were just beginning at that time and continue to grow as I spend time with it.
The idea is simple yet powerful: GameBench gives Android users, and soon iOS users, the ability to monitor the frame rates of nearly any game or 3D application you can run on your phone or tablet, accurately measuring real-world performance. This is similar to what we have done for years on the PC with FRAPS, allowing us to gather average frames-per-second data over time. This capability was previously unavailable to consumers, or the press for that matter, and could be a very powerful tool for device-to-device comparisons going forward. The ability to use actual games and applications and gather benchmark data that reflects the consumer experience, rather than the synthetic graphics tests we have been forced to use in the past, will fundamentally change how we test and compare mobile hardware.
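The core of what a FRAPS- or GameBench-style tool reports is simple to sketch: capture a timestamp for each frame presented, then divide frames rendered by elapsed time. The snippet below is a hypothetical illustration of that calculation, not GameBench's actual code or output format:

```python
# Sketch of the average-FPS calculation a frame-rate monitor performs.
# The timestamp list here is synthetic sample data, not real capture output.

def average_fps(frame_times_s):
    """Average frames per second from a list of per-frame timestamps (seconds)."""
    if len(frame_times_s) < 2:
        raise ValueError("need at least two frame timestamps to measure a rate")
    elapsed = frame_times_s[-1] - frame_times_s[0]
    frames_rendered = len(frame_times_s) - 1  # count intervals, not samples
    return frames_rendered / elapsed

# A steady 16.6 ms per frame over two seconds works out to roughly 60 FPS.
timestamps = [i * 0.0166 for i in range(121)]
print(round(average_fps(timestamps)))  # → 60
```

Real tools also track the distribution of frame times (stutter), since two runs with identical averages can feel very different in hand, which is exactly why real-game capture beats a single synthetic score.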
Image source: GameBench.net
Today, GameBench itself released a small report meant to showcase some of the kinds of data the software can gather while also revealing early support for Apple's iPhone and iPad devices. The primary competitors in the comparison are the Apple iPhone 6, the Samsung Galaxy S6, the HTC One M9, and the Motorola Nexus 6. I was able to get an early look at the report and offer some feedback, and I will share my views on the results with our readers.
GameBench tested those four devices in a total of 10 games:
- Asphalt 8: Airborne
- Real Racing 3
- Dead Trigger 2
- Kill Shot
- Modern Combat 5: Blackout
- Boom Beach
- XCOM: Enemy Unknown
- GTA: San Andreas
- Marvel: Contest of Champions
- Monument Valley
These games vary in price and in play style, but they are all in the top-50 games lists for each platform and are known for their graphically intense visuals.
Subject: General Tech | November 27, 2014 - 09:29 PM | Scott Michaud
Tagged: apple, safari, google, yahoo, bing, microsoft, mozilla
After Mozilla inked its deal with Yahoo, eyes turned to Apple and its Safari browser. Currently, the default search engine is Google on both iOS and OSX, although Bing is the primary engine used for other functions, like Siri and Spotlight. Apple is tied into a contract with Google on those two platforms until early 2015, but who will get the new contract?
Apparently Yahoo and Microsoft have both approached the company for the position, and Apple is not ruling any of the three out. Probably the most interesting part is how seriously Yahoo is taking the search business. The deal with Mozilla is fairly long-term, and with Yahoo approaching Apple as well, it probably was not simply a case of no one else wanting to be Firefox's default. Yahoo would likely need significant monetary backing for an Apple deal, which suggests the same was true of its deal with Mozilla.
If both Mozilla and Apple leave Google, it will take a significant chunk out of Google's search traffic. Power users, like those who read this site, will likely be unaffected, because the barrier to changing the default search engine is so low. On the other hand, even the most experienced user will often accept default settings until there is a reason to change. The winning party will need a good enough product to overcome that initial friction.
But the money will at least give them a chance when the decision comes into effect. That is, unless the barrier to changing default search engines is less than the barrier to changing default web browsers.
Google will always be default on Google Chrome.
Subject: General Tech | November 25, 2014 - 12:19 PM | Jeremy Hellstrom
Tagged: osx, ubuntu 14.10, linux, apple, OS
Over at Phoronix you can see a comparison between the new Apple OS X 10.10 and the newest release of Ubuntu, 14.10. It is an interesting performance comparison, as both OSes were tested on the same system: a 2013 MacBook Air with a Haswell i5-4250U with onboard HD 5000 graphics, 4GB of DDR3-1600, and the Apple-branded SSD. For content creators and those with no interest in running Windows, it highlights the contrasts you can expect between the two operating systems in data transfer and graphics applications. Right from the start you can see that the contest is somewhat one-sided: the first benchmark, PostMark, showed the disk performing three times as fast with Ubuntu installed as with OS X. The results get closer in some benchmarks, but overall Linux outpaces OS X significantly.
"While I delivered some OS X 10.10 Yosemite preview benchmarks back in August, here's my first tests of the official release of Apple OS X 10.10.1 compared to Ubuntu 14.10 Linux. Tests were done of OS X 10.9.5 and OS X 10.10.1 against Ubuntu 14.10 Utopic Unicorn when running the benchmarks under both GCC and LLVM Clang compilers."
Here is some more Tech News from around the web:
- ARM seeing troubles penetrating into PC, server industries; IoT becomes new battlefield for Intel, ARM @ DigiTimes
- BitTorrent users are 170 percent more likely to download legally than non-torrenters @ The Inquirer
- Researchers Say the Tech Worker Shortage Doesn't Really Exist @ Slashdot
- Regin Malware In EU Attack Linked To US and British Intelligence Agencies @ Slashdot
- You stupid BRICK! PCs running Avast AV can't handle Windows fixes @ The Register
- Bond villains lament as Wicked Lasers withdraw death ray @ The Register
Subject: General Tech | October 30, 2014 - 01:01 PM | Jeremy Hellstrom
Tagged: iPad Air 2, apple
There were long lineups of people desperate to get their hands on the new iPad Air 2, despite the fact that its internals cost a mere $1 more than the initial model's. To be fair, that is not the best way to judge the quality of the upgrade; that should rely more on the screen quality ... which is exactly the same in all respects except for a new anti-reflective coating. Apple is also reducing their markup, from 45-61% down to a paltry 45-57% for this generation, so at least that $1.00 extra in materials will not raise your purchase price overly. The internals, such as the TSMC-made A8X and the camera, match the iPhone 6 to a large extent, making it a more powerful tablet than the original, so don't disparage it too much. You can read more at The Register if you are into fruit.
"New iPad Air 2 components cost Apple just one dollar more than the previous model, according to the teardown bods at IHS."
Here is some more Tech News from around the web:
- The TR Podcast 164: We get twitchy over Apples, Nexuses, and beefy games
- Hey - who wants 4.8 TERABYTES almost AS FAST AS MEMORY? @ The Register
- Drupal Warns Users of Mass, Automated Attacks On Critical Flaw @ Slashdot
- More Microsoft staffers shown the door in Round 3 of job cuts @ The Register
Subject: Editorial, General Tech, Systems | October 17, 2014 - 03:22 PM | Scott Michaud
Tagged: Thunderbolt 2, thunderbolt, mac mini, mac, Intel, haswell, apple
I was not planning to report on Apple's announcement but, well, this just struck me as odd.
So Apple has relaunched the Mac Mini with fourth-generation Intel Core processors, after two years of waiting. It is the same height as the Intel NUC, but it is also almost twice the length and twice the width (Apple's 20cm x 20cm versus the NUC's ~11cm x 11cm when the case is included). So, after waiting through the entire Haswell launch cycle, right up until the imminent release of Broadwell, they are going with the soon-to-be-outdated architecture to update their two-year-old platform?
((Note: The editorial originally said "two-year-old architecture". I thought that Haswell launched about six months earlier than it did. The mistake was corrected.))
I wonder if, following the iTunes U2 deal, this device will come bundled with Limp Bizkit's "Nookie"...
The price has been reduced to $499, which is a welcome $100 price reduction especially for PC developers who want a Mac to test cross-platform applications on. It also has Thunderbolt 2. These are welcome additions. I just have two, related questions: why today and why Haswell?
The new Mac Mini started shipping yesterday. 15-watt Broadwell-U is expected to launch at CES in January, with 28W parts anticipated the following quarter.
Subject: General Tech | October 2, 2014 - 02:05 PM | Ken Addison
Tagged: X99 Classified, X99, video, tlc, tegra k1, ssd, Samsung, podcast, nvidia, micron, M600, iphone 6, g-sync, freesync, evga, broadwell-u, Broadwell, arm, apple, amd, adaptive sync, a8, 840 evo, 840
PC Perspective Podcast #320 - 10/02/2014
Join us this week as we discuss the Micron M600 SSD, NVIDIA and Adaptive Sync, Windows 10 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:27:21
One Small Step
While most articles about the iPhone 6 and iPhone 6 Plus thus far have focused on user experience and the larger screen sizes, our main questions about these new phones concern performance, and in particular the effect of Apple's transition to the 20nm process node for the A8 SoC. Naturally, I decided to put my personal iPhone 6 through our usual round of benchmarks.
First, let's start with 3DMark.
Comparing the 3DMark scores of the new Apple A8 to even the last-generation A7 shows a smaller improvement than we are used to seeing generation-to-generation with Apple's custom ARM implementations. When you compare the A8 to something like the NVIDIA Tegra K1, which utilizes desktop-class GPU cores, the K1's overall score blows Apple out of the water. Even looking at the CPU-bound physics score, the K1 is still the winner.
A 78% performance advantage in overall score when compared to the A8 shows just how much of a powerhouse NVIDIA has with the K1. (Though clearly power envelopes are another matter entirely.)
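For readers keeping score at home, the percentage advantages quoted throughout come from simple ratios of benchmark scores. A quick sketch with hypothetical numbers (not the actual 3DMark results):

```python
def percent_advantage(score_a, score_b):
    """How much faster score_a is than score_b, expressed as a percentage."""
    return (score_a / score_b - 1.0) * 100.0

# Hypothetical example: a device scoring 30,260 against one scoring 17,000
# holds a 78% advantage, the size of gap described above.
print(round(percent_advantage(30260, 17000)))  # → 78
```

Note the asymmetry: a 78% advantage for the faster chip does not mean the slower one is "78% slower", which is worth keeping in mind when comparing headlines across reviews.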
If we look at more CPU benchmarks, like the browser-based Google Octane and SunSpider tests, the A8 starts to shine more.
While the A8 edges out the A7 to be the best performing device and 54% faster than the K1 in SunSpider, the A8 and K1 are neck and neck in the Google Octane benchmark.
Moving back to a graphics-heavy benchmark, GFXBench's Manhattan test, the Tegra K1 has a 75% performance advantage over the A8, though the A8 is 36% faster than the previous A7 silicon.
These early results are certainly a disappointment compared to the usual generation-to-generation performance increase we see with Apple SoCs.
However, the other aspect to look at is power efficiency. With normal use I have noticed a substantial increase in the battery life of my iPhone 6 over the last-generation iPhone 5S. While this may be partly due to a small (about 1 Wh) increase in battery capacity, I think more can be credited to this being an overall more efficient device. Choices like sticking to a highly optimized dual-core CPU design and quad-core GPU, as well as the move to the 20nm process node, all contribute to increased battery life while still surpassing the performance of the last-generation Apple A7.
In that way, the A8 moves the bar forward for Apple and is a solid first attempt at using TSMC's 20nm silicon technology. There is strong potential that with further refined parts (like the expected A8X for the iPad revisions), Apple will be able to further surpass 28nm silicon in performance and efficiency.
Subject: Graphics Cards, Processors, Mobile | September 29, 2014 - 01:53 AM | Scott Michaud
Tagged: apple, a8, a7, Imagination Technologies, PowerVR
First, Chipworks released a dieshot of the new Apple A8 SoC (stored at archive.org). It is based on the 20nm fabrication process from TSMC, which they allegedly bought the entire capacity for. From there, a bit of a debate arose regarding what each group of transistors represented. All sources claim that it is based around a dual-core CPU, but the GPU is a bit polarizing.
Image Credit: Chipworks via Ars Technica
Most sources, including Chipworks, Ars Technica, AnandTech, and so forth, believe that it is a quad-core graphics processor from Imagination Technologies. Specifically, they expect that it is the GX6450 from the PowerVR Series 6XT. This is a narrow upgrade over the G6430 found in the Apple A7 processor, which is in line with the initial benchmarks that we saw (and not in line with the 50% GPU performance increase that Apple claims). For programmability, the GX6450 is equivalent to a DirectX 10-level feature set, unless it was extended by Apple, which I doubt.
Image Source: DailyTech
DailyTech has their own theory, suggesting that it is a GX6650 that is horizontally-aligned. From my observation, their "Cluster 2" and "Cluster 5" do not look identical at all to the other four, so I doubt their claims. I expect that they heard Apple's 50% claims, expected six GPU cores as the rumors originally indicated, and saw cores that were not there.
Which brings us back to the question of, "So what is the 50% increase in performance that Apple claims?" Unless they had a significant increase in clock rate, I still wonder if Apple is claiming that their increase in graphics performance will come from the Metal API even though it is not exclusive to new hardware.
But from everything we have seen so far, it is just a handful of percent better.