Xbox One announced, the games: not so much.

Subject: Editorial, General Tech, Graphics Cards, Processors, Systems | May 21, 2013 - 05:26 PM |
Tagged: xbox one, xbox

xbox-one-head.jpg

Almost exactly three months have passed since Sony announced the PlayStation 4, and just three weeks remain until E3. Ahead of the event, Microsoft unveiled their new console: the Xbox One. Being so close to E3, they are saving the majority of the games for that show. For now, the focus is the box itself and its non-gaming functionality.

First and foremost, the raw specifications:

  • AMD APU (5 billion transistors, 8 core, on-die eSRAM)
  • 8GB RAM
  • 500GB Storage, Blu-ray reader
  • USB 3.0, 802.11n, HDMI out, HDMI in

The hardware is a definite win for AMD. The Xbox One is based upon an APU which is quite comparable to what the PS4 will offer. Unlike previous generations, there will not be much differentiation in available performance between the two consoles; I would not expect to see much of a fork in split-screen support and other performance-sensitive features.

xbox-one-controller.jpg

A new version of the Kinect sensor will also ship with every unit, so developers can depend upon its presence. Technically speaking, the camera is higher resolution and has a wider field of view; up to six skeletons can be tracked, with joints able to rotate rather than just hinge. Microsoft is also finally permitting developers to use the Kinect alongside a standard controller so that, as they imagine it, a user could raise their controller to block with a shield. That is the hope, but remember that around the launch of the original Kinect, Microsoft filed a patent for sign language recognition, and that has yet to happen. Who knows whether the device will be successfully integrated into gaming applications.

Of course, Microsoft is best known for system software, and the Xbox One runs three lightweight operating environments. In Windows 8, you have the Modern interface, which runs WinRT applications, and you have the desktop environment, which is x86 compatible.

The Xbox One borrows more than a little from this model.

The console's home screen, which I am tempted to call the Start Screen, has a very familiar tiled interface. The tiles are not identical to Windows, but they are definitely consistent with it. This interface allows access to Internet Explorer and an assortment of apps. These apps can be pinned to the side of the screen, much like snapping a Windows 8 Modern app. I am expecting "a lot of crossover" (to say the least) between this and the Windows Store; I would not be surprised if it is basically the same API. Snapping works both while viewing entertainment content and within a game.

Xbox_Home_UI_EN_US_Male_SS.jpg

These three operating systems run at the same time. The main operating system is essentially a Hyper-V-like hypervisor which runs the other two simultaneously in something like virtual machines. Their outputs can be layered with low latency, since all you are doing is compositing them in a different order.
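
To illustrate that compositing point with a minimal sketch (the layer values and function are my own illustration, not anything Microsoft described): if each environment renders into its own layer, bringing a different one to the front is just a reordering of the blend, not a switch between virtual machines.

    # Illustrative Python sketch of z-order compositing; not Microsoft's code.
    def composite(layers):
        """Blend per-OS layers back-to-front with the 'over' operator.
        Each layer is one pixel as (r, g, b, a), colour premultiplied by alpha."""
        out = (0.0, 0.0, 0.0, 0.0)
        for r, g, b, a in layers:
            out = (r + out[0] * (1.0 - a),
                   g + out[1] * (1.0 - a),
                   b + out[2] * (1.0 - a),
                   a + out[3] * (1.0 - a))
        return out

    game = (0.1, 0.4, 0.8, 1.0)    # opaque game frame
    app  = (0.3, 0.3, 0.3, 0.3)    # translucent snapped app (30% white)
    print(composite([game, app]))  # app drawn over the game
    print(composite([app, game]))  # reorder the list: game back on top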

Lastly, they made reference to Xbox Live, go figure. Microsoft is seriously increasing their server capacity and expects developers to utilize Azure infrastructure to offload "latency-insensitive" computation for games. While Microsoft promises that you can play games offline, this obviously does not apply to features (or whole games) which rely upon the back-end infrastructure.
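
As a rough sketch of what that split might look like in practice (the function names and the "cloud job" are entirely hypothetical, not Microsoft's API): per-frame work stays on the console, while slow-changing computation is requested from remote servers and folded in whenever the result arrives.

    # Hypothetical illustration of offloading latency-insensitive work.
    from concurrent.futures import ThreadPoolExecutor
    import time

    def cloud_world_update(region):
        """Stand-in for a latency-insensitive job (background AI, world
        simulation, lighting bakes) that could run on remote servers."""
        time.sleep(0.5)                      # pretend network + compute time
        return f"new plan for {region}"

    def render_frame(n):
        """Latency-sensitive per-frame work that must stay local."""
        return f"frame {n}"

    with ThreadPoolExecutor(max_workers=1) as cloud:
        pending = cloud.submit(cloud_world_update, "sector-7")
        for frame in range(120):             # ~2 seconds at a simulated 60 fps
            render_frame(frame)
            if pending.done():               # fold results in whenever they land
                print(pending.result())
                pending = cloud.submit(cloud_world_update, "sector-7")
            time.sleep(1 / 60)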

xbox-one-live.jpg

And yes, I know you will all beat up on me if I do not mention the SimCity debacle. Maxis claimed that much of the game required an online connection due to complicated server requirements; after a crack enabled offline play, it became clear that the game mostly runs fine on a local client. How much will the Xbox Live cloud service actually offload? Who knows, but that is at least Microsoft's official word.

Now to tie up some loose ends. The Xbox One will not be backwards compatible with Xbox 360 games, although that is no surprise. Microsoft also says they are allowing users to resell and lend games. That said, from what I have heard, games will be installed and will not require the disc afterwards. Beyond the concern of how much fits on a single 500GB drive, rumor has it that once a game is installed, loading it elsewhere (the rumor is even less clear about whether "elsewhere" means another account or another machine) will require paying a fee to Microsoft. In other words? Basically not a used game.

Well, that about covers it. You can be sure we will add more as information comes forth. Comment away!

Source: Xbox.com

HP SlateBook x2: Tegra 4 on Android 4.2.2 in August

Subject: General Tech, Graphics Cards, Processors, Mobile | May 15, 2013 - 09:02 PM |
Tagged: tegra 4, hp, tablets

Sentences containing the words "Hewlett-Packard" and "tablet" can end in a question mark, an exclamation mark, or, on occasion, a period. The gigantic multinational technology company tried to own a whole mobile operating system with its purchase of Palm, then abandoned those plans just as abruptly; the $99 liquidation of $500 tablets was, go figure, so successful that they more or less did it twice. The operating system was open sourced, and eventually LG swooped in and bought it, minus the patents, for use in smart TVs.

So how about that Android?

HP-slatex2-01.jpg

The floodgates are open on Tegra 4, with HP announcing its SlateBook x2 hybrid tablet just a single day after NVIDIA's SHIELD made its own move out of the project stage. The SlateBook x2 uses the Tegra 4 processor to power Android 4.2.2 Jelly Bean with the full Google experience, including the Google Play store. Alongside Google Play, the SlateBook and its Tegra 4 processor also get access to TegraZone and NVIDIA's mobile gaming ecosystem.

As for the device itself, it is a 10.1" Android tablet which docks into a keyboard for extended battery life, I/O ports, and, well, a hardware keyboard. You can attach the tablet to a TV via HDMI; it also has the typical USB 2.0 port, a combo audio jack, and a full-sized SD card slot, although which half (tablet or dock) houses any given port is anyone's guess. Wirelessly, you have 802.11a/b/g/n WiFi and some unspecified version of Bluetooth.

HP-slatex2-02.jpg

The raw specifications list follows:

  • NVIDIA Tegra 4 SoC
    • ARM Cortex A15 quad core @ 1.8 GHz
    • 72 "Core" GeForce GPU @ ~672MHz, 96 GFLOPS
  • 2GB DDR3L RAM ("Starts at", maybe more upon customization?)
  • 64GB eMMC SSD
  • 1920x1200 10.1" touch-enabled IPS display
  • HDMI output
  • 1080p rear camera, 720p front camera with integrated microphone
  • 802.11a/b/g/n + Bluetooth (4.0??)
  • Combo audio jack, USB 2.0, SD Card reader
  • Android 4.2.2 w/ Full Google and TegraZone experiences.
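
That 96 GFLOPS figure checks out if each of the 72 shader cores is assumed to retire one multiply-add (counted as two floating-point operations) per clock, which is the usual way these peak numbers are quoted:

    # Peak-throughput arithmetic for the Tegra 4 GPU figure above.
    cores = 72
    clock_ghz = 0.672                 # ~672 MHz
    flops_per_core_per_clock = 2      # one multiply-add counted as 2 FLOPs
    print(cores * clock_ghz * flops_per_core_per_clock)   # 96.768, quoted as "96 GFLOPS"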

If this excites you, then you only have to wait until some point in August; you will also, of course, need to wait until you save up about $479.99 plus tax and shipping.

Source: HP

Haswell Laptop specs! NEC LaVie L to launch in Japan

Subject: General Tech, Graphics Cards, Processors, Systems, Mobile | May 14, 2013 - 03:54 PM |
Tagged: haswell, nec

While we are not sure when it will be released or whether it will be available in North America, we have found a Haswell laptop. Actually, NEC will release two products in this lineup: a high-end 1080p unit and a lower-end 1366x768 model. Unfortunately, the source article is in Japanese.

nec_haswell_01.jpg

IPS displays have really wide viewing angles, even top and bottom.

NEC is known for their higher-end monitors; most people equate Dell's UltraSharp panels with professional photo and video production, but Dell's top-end offerings are often a tier below the best from companies like NEC and Eizo. The laptops we are discussing today both contain touch-enabled IPS panels with apparently double the contrast ratio of what NEC considers standard. While these may or may not be NEC's tip-top panels, they should at least be decent screens.

Obviously the headliner for us is the introduction of Haswell. While we do not know exactly which part NEC decided to embed, we do know that they are relying upon it for graphics performance. Combined with the aforementioned higher-end displays, it seems likely that NEC is aiming this device at the professional market. A price tag of 190,000 yen (just under $1,900 USD) for the lower-end model and 200,000 yen (just under $2,000 USD) for the higher end further suggests this target demographic.

nec_haswell_02.jpg

Clearly a Japanese model.

The professional market does not exactly have huge requirements for graphics performance, but explicitly seeing NEC trust Intel for GPU performance is an interesting twist. Intel's HD 4000 has been nibbling, to say the least, at discrete GPU market share in laptops. I would expect this laptop to contain one of the BGA-based parts, which are soldered onto the motherboard, for the added graphics performance.

As a final note, the higher-end model will also contain a draft 802.11ac antenna. Network performance could reach up to 867 megabits per second as a result, which is the two-spatial-stream rate for 802.11ac on an 80 MHz channel.

Of course I could not get away without publishing the raw specifications:

LL850/MS (Price: 200000 yen):

  • Fourth-generation Intel Core processor with onboard video
  • 8GB DDR3 RAM
  • 1TB HDD w/ 32GB SSD caching
  • BDXL (100-128GB Blu-ray disc) drive
  • IEEE 802.11ac WiFi adapter, Bluetooth 4.0
  • SDXC, Gigabit Ethernet, HDMI, USB3.0, 2x2W stereo Yamaha speakers
  • 1080p IPS display with touch support
  • Office Home and Business 2013 preinstalled?

LL750/MS (Price: 190000 yen):

  • Fourth-generation Intel Core processor with onboard video
  • 8GB DDR3 RAM
  • 1TB HDD (no SSD cache)
  • (Optical disc support not mentioned)
  • IEEE 802.11a/b/g/n WiFi adapter, Bluetooth 4.0
  • SDXC, Gigabit Ethernet, HDMI, USB3.0, 2x2W stereo Yamaha speakers
  • 1366x768 (IPS?) touch-enabled display

Corsair has, well, Haswell PSU support chart

Subject: Editorial, General Tech, Cases and Cooling, Processors | May 10, 2013 - 04:23 PM |
Tagged: c6, c7, haswell, PSU, corsair

I cannot do it, captain! I don't have the power!

We have been discussing the ultra-low power states of Haswell processors for a little over a week, and how they could be detrimental to certain power supplies. Power supply manufacturers never quite expected that a running system could draw as little as 0.05 amps (0.6W) on the 12V rail. Since then, companies such as Enermax have started to list power supplies which have been tested and are compliant with the new power requirements.

Series   Model     Haswell Compatibility   Comment
AXi      AX1200i   Yes                     100% Compatible with Haswell CPUs
AXi      AX860i    Yes                     100% Compatible with Haswell CPUs
AXi      AX760i    Yes                     100% Compatible with Haswell CPUs
AX       AX1200    Yes                     100% Compatible with Haswell CPUs
AX       AX860     Yes                     100% Compatible with Haswell CPUs
AX       AX850     Yes                     100% Compatible with Haswell CPUs
AX       AX760     Yes                     100% Compatible with Haswell CPUs
AX       AX750     Yes                     100% Compatible with Haswell CPUs
AX       AX650     Yes                     100% Compatible with Haswell CPUs
HX       HX1050    Yes                     100% Compatible with Haswell CPUs
HX       HX850     Yes                     100% Compatible with Haswell CPUs
HX       HX750     Yes                     100% Compatible with Haswell CPUs
HX       HX650     Yes                     100% Compatible with Haswell CPUs
TX-M     TX850M    Yes                     100% Compatible with Haswell CPUs
TX-M     TX750M    Yes                     100% Compatible with Haswell CPUs
TX-M     TX650M    Yes                     100% Compatible with Haswell CPUs
TX       TX850     Yes                     100% Compatible with Haswell CPUs
TX       TX750     Yes                     100% Compatible with Haswell CPUs
TX       TX650     Yes                     100% Compatible with Haswell CPUs
GS       GS800     Yes                     100% Compatible with Haswell CPUs
GS       GS700     Yes                     100% Compatible with Haswell CPUs
GS       GS600     Yes                     100% Compatible with Haswell CPUs
CX-M     CX750M    Yes                     100% Compatible with Haswell CPUs
CX-M     CX600M    TBD                     Likely compatible; currently validating
CX-M     CX500M    TBD                     Likely compatible; currently validating
CX-M     CX430M    TBD                     Likely compatible; currently validating
CX       CX750     Yes                     100% Compatible with Haswell CPUs
CX       CX600     TBD                     Likely compatible; currently validating
CX       CX500     TBD                     Likely compatible; currently validating
CX       CX430     TBD                     Likely compatible; currently validating
VS       VS650     TBD                     Likely compatible; currently validating
VS       VS550     TBD                     Likely compatible; currently validating
VS       VS450     TBD                     Likely compatible; currently validating
VS       VS350     TBD                     Likely compatible; currently validating

Above is Corsair's slightly incomplete chart as it appeared on their website at the time of writing (3:30pm on May 10th, 2013); so far everything is coming up good. Their blog should be updated as new products get validated for the new C6 and C7 CPU sleep states.

The best part of this story is just how odd it is given the race toward arc-welding-class supplies (it's not a podcast so you can't call Bingo! hahaha!) that we have been watching over the last several years. Simply put, some companies never thought that component manufacturers such as Intel would race toward the bottom of the power-draw scale.

Source: Corsair

AMD to erupt Volcanic Islands GPUs as early as Q4 2013?

Subject: Editorial, General Tech, Graphics Cards, Processors | May 8, 2013 - 09:32 PM |
Tagged: Volcanic Islands, radeon, ps4, amd

So the Southern Islands might not be entirely stable throughout 2013 as we originally reported; seismic activity being analyzed suggests the eruption of a new GPU micro-architecture as early as Q4. These Volcanic Islands, as they have been codenamed, should explode onto the scene opposing NVIDIA's GeForce GTX 700-series products.

It is times like these where GPGPU-based seismic computation becomes useful.

The rumor is based upon a source which leaked a fragment of a slide outlining the processor in block-diagram form, along with specifications of its alleged flagship chip, "Hawaii". Of primary note, Volcanic Islands is rumored to be organized into both Serial Processing Modules (SPMs) and a Parallel Compute Module (PCM).

Radeon9000.jpg

So apparently a discrete GPU can have serial processing units embedded on it now.

Heterogeneous Systems Architecture (HSA) is a set of initiatives to bridge the gap between massively parallel workloads and branching logic tasks. We usually discuss this in terms of APUs bringing parallel-optimized hardware to the CPU; in this case, it is about bringing serial processing to the discrete GPU. According to the diagram, the chip would contain 8 processor modules, each with two processing cores and an FPU, for a total of 16 cores. There does not seem to be any definite indication of whether these cores would be based upon AMD's license to produce x86 processors or their other license to produce ARM processors. Unlike an APU, this design is heavily skewed toward parallel computation rather than a relatively even balance between CPU, GPU, and chipset features.

Now of course, why would they do that? Graphics processors can handle branching logic, but it tends to cut performance sharply. With an architecture such as this, a programmer might be able to switch between parallel and branching tasks more efficiently, without an expensive hop across the motherboard and PCIe bus between devices. Josh Walrath suggested a server containing these as essentially add-in-card computers. For gamers, this might help with workloads such as AI, which is awkwardly split between branching logic and massively parallel visibility and path-finding tasks. Josh seems skeptical about this until HSA becomes more widely adopted, however.
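
To put rough numbers on why that trip across the bus hurts (my own back-of-envelope figures, not anything from the leak): even ignoring kernel-launch latency, just copying a modest working set over a PCIe 3.0 x16 link and back costs on the order of a millisecond, while an on-die handoff through shared memory moves nothing at all.

    # Back-of-envelope cost of bouncing a task between CPU and discrete GPU.
    # Bandwidth figure is the theoretical peak for PCIe 3.0 x16.
    working_set_gb = 8 / 1024          # 8 MB of data the branching code must see
    pcie_gb_per_s = 15.75
    one_way_ms = working_set_gb / pcie_gb_per_s * 1000

    print(f"PCIe round trip: ~{2 * one_way_ms:.2f} ms")   # roughly 1 ms
    print("On-die handoff: effectively free (same memory, pass a pointer)")

At 60 frames per second the entire frame budget is only about 16.7 ms, so making that round trip more than a handful of times per frame adds up quickly; keeping the serial cores on the same die sidesteps the copy entirely.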

Still, there is a reason why they would implement this now. I wonder, if the SPMs are based upon simple x86 cores, how the PS4 will influence PC gaming. Technically, a Volcanic Islands GPU would be an oversized PS4 on an add-in card. This could give AMD an edge, particularly in games ported to the PC from the PlayStation.

This chip, Hawaii, is rumored to have the following specifications:

  • 4096 stream processors
  • 16 serial processor cores on 8 modules
  • 4 geometry engines
  • 256 TMUs
  • 64 ROPs
  • 512-bit GDDR5 memory interface (the same memory type as the PS4, though on a wider bus)
  • 20 nm Gate-Last silicon fab process
    • Unclear if TSMC or "Common Platform" (IBM/Samsung/GLOBALFOUNDRIES)

Softpedia is also reporting on this leak, adding the claim that the GPU will be designed on a 20nm gate-last fabrication process. While gate-last is generally considered not worth the extra production effort, Fully Depleted Silicon On Insulator (FD-SOI) apparently works remarkably well with gate-last at 28nm and smaller nodes. This could mean that AMD is eyeing that technology and designing the chip with the intent of later switching to an FD-SOI process, avoiding the large redesign that starting with the initially easier gate-first approach would require.

Well that is a lot to process... so I will leave you with an open question for our viewers: what do you think AMD has planned with this architecture, and what do you like and/or dislike about what your speculation would mean?

Source: TechPowerUp

Intel plans a new Atom every year, starting with Silvermont

Subject: General Tech, Processors | May 6, 2013 - 02:34 PM |
Tagged: silvermont, merrifield, Intel, Bay Trail, atom

The news today is all about shrinking the Atom, both in process size and power consumption. Indeed, The Tech Report heard talk of milliwatts and SoCs, which shows how Intel's strategy for Atom has shifted from small-footprint HTPCs toward point-of-sale and other ultra-low-power applications. Hyper-Threading has been dropped and out-of-order execution has been brought in, which makes far more sense for the new niche Atom is destined for.

Make sure to check out Ryan's report here as well.

TR_core-block.png

"Since their debut five years ago, Intel's Atom microprocessors have relied on the same basic CPU core. Next-gen Atoms will be based on the all-new Silvermont core, and we've taken a closer look at its underlying architecture."

Here is some more Tech News from around the web:

Tech Talk

Author:
Subject: Processors, Mobile
Manufacturer: Intel

A much needed architecture shift

It has been almost exactly five years since the release of the first Atom branded processors from Intel, starting with the Atom 230 and 330 based on the Diamondville design.  Built for netbooks and nettops at the time, the Atom chips were a reaction to a unique market that the company had not planned for.  While the early Atoms were great sellers, they were universally criticized by the media for slow performance and sub-par user experiences. 

Atom has seen numerous refreshes since 2008, but they were all modifications of the simplistic, in-order architecture that was launched initially.  With today's official release of the Silvermont architecture, the Atom processors see their first complete redesign from the ground up.  With the focus on tablets and phones rather than netbooks, can Intel finally find a foothold in the growing markets dominated by ARM partners? 

I should note that even though we are seeing the architectural reveal today, Intel doesn't plan on having shipping parts until late in 2013 for embedded, server and tablet products, and not until 2014 for smartphones. Why the early reveal on the design, then? I think that pressure from ARM-based designs (Krait, Exynos) as well as the upcoming release of AMD's own Kabini is forcing Intel's hand a bit. Certainly they don't want to be perceived as having fallen behind, and getting news about the potential benefits of their own x86 option out to the public will help.

silvermont26.jpg

Silvermont will be the first Atom processor built on the 22nm process, leaving the 32nm designs of Saltwell behind it. This also marks the beginning of a new approach to the Atom design process, adopting the tick/tock model we have seen on Intel's consumer desktop and notebook parts. After the next node drop to 14nm, we'll see an annual cadence that first focuses on the node change, then on an architecture change at the same node.

By keeping Atom on the same process technology as Core (Ivy Bridge, Haswell, etc), Intel can put more of a focus on the power capabilities of their manufacturing.

Continue reading about the new Intel Silvermont architecture for tablets and phones!!

Overclocker Pushes An Intel Haswell Core i7-4770K CPU Beyond 7GHz

Subject: Processors | May 3, 2013 - 06:45 AM |
Tagged: z87, overclocking, Intel, haswell, core i7 4770k, 7ghz

OCaholic has spotted an interesting entry in the CPU-Z database. According to the site, an overclocker by the handle of “rtiueuiurei” has allegedly managed to push an engineering sample of Intel’s upcoming Haswell Core i7-4770K processor past 7GHz.

Intel Core i7-4770K Overclocked Beyond 7GHz.jpg

If the CPU-Z entry is accurate, the overclocker used a BCLK speed of 91.01 and a multiplier of 77 to achieve a CPU clockspeed of 7012.65MHz. The chip was overclocked on a Z87 motherboard along with a single 2GB G.Skill DDR3 RAM module. Even more surprising than the 7GHz clockspeed is the voltage that the overclocker used to get there: an astounding 2.56V according to CPU-Z.
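
As a quick sanity check of the arithmetic (the final clock is simply base clock times multiplier), the reported values come out a few MHz short of the headline figure, which suggests the displayed BCLK is rounded:

    # Core clock = BCLK x multiplier, using the values shown in CPU-Z.
    bclk_mhz = 91.01
    multiplier = 77
    print(bclk_mhz * multiplier)       # 7007.77 MHz
    print(7012.65 / multiplier)        # ~91.07 MHz, the BCLK implied by 7012.65 MHz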

From the information Intel provided at IDF Beijing, the new 22nm Haswell processors feature an integrated voltage regulator (IVR), and the input voltage feeding the CPU portion of the chip is controlled by the Vccin value. Intel recommends a range of 1.8V to 2.3V for this value, with a maximum of 3V and a default of 1.8V. Therefore, the CPU-Z-reported number may actually be correct. On the other hand, it may also just be a bug in the software due to the unreleased nature of the Haswell chip.

Voltage questions aside, the frequency alone makes for an impressive overclock, and it seems that the upcoming chips will have decent overclocking potential!

Source: OCaholic
Author:
Manufacturer: Intel

The Intel HD Graphics are joined by Iris

Intel gets a bad rap on the graphics front. Much of it is warranted, but a lot of it is really just poor marketing of the technologies and features they implement and improve upon. When AMD or NVIDIA update a driver, fix a bug, or bring a new gaming feature to the table, they make sure that every single PC hardware website knows about it and, thus, that as many PC gamers as possible know about it. The same cannot be said about Intel though - they are much more understated when it comes to tooting their own horn. Maybe that's because they are afraid of being called out on some aspects, or because they have a little bit of performance envy compared to the discrete options on the market.

Today might be the start of something new from the company though - a bigger focus on the graphics technology in Intel processors. More than a month before the official public unveiling of the Haswell processors, Intel is opening up about SOME of the changes coming to the Haswell-based graphics products.

We first learned about the changes to Intel's Haswell graphics architecture way back in September of 2012 at the Intel Developer Forum.  It was revealed then that the GT3 design would essentially double theoretical output over the currently existing GT2 design found in Ivy Bridge.  GT2 will continue to exist (though slightly updated) on Haswell and only some versions of Haswell will actually see updates to the higher-performing GT3 options.  

01.jpg

In 2009, Intel announced a drive to increase graphics performance at an exceptional rate from generation to generation. Not long after, they released the Sandy Bridge CPU and the most significant performance increase in processor graphics ever. Ivy Bridge followed with a nice increase in graphics capability, but not nearly as dramatic as the SNB jump. Now, according to this graphic, the graphics capability of Haswell will be as much as 75x better than the chipset-based graphics of 2006. The real question is which variants of Haswell will have that performance level...

02.jpg

I should note right away that even though we are showing you general performance data on graphics, we still don't have all the details on what SKUs will have what features on the mobile and desktop lineups.  Intel appears to be trying to give us as much information as possible without really giving us any information. 

Read more on Haswell's new graphics core here.

Possible power supply issues for Intel Haswell CPUs

Subject: Cases and Cooling, Processors | May 1, 2013 - 03:07 PM |
Tagged: power supply, Intel, idle, haswell, c7, c6

I came across an interesting news story posted by The Tech Report this morning that dives into possible problems between Intel's upcoming Haswell processors and currently available power supplies. Apparently, the new C6 and C7 idle power states that give the Haswell architecture its low-power benefits require the power supply to remain stable with a load as low as 0.05 amps on the 12V2 rail. (That's just 50 milliamps!) Without that capability, the system can exhibit unstable behavior, and a quick look at the power supply selector on Intel's own website lists only a couple dozen units that support the feature.

haswellpsu.jpg

This table from VR-Zone, the original source of the information, shows the difference between the requirements for 3rd-generation (Ivy Bridge) and 4th-generation (Haswell) processors. The shift is an order of magnitude and is quite a dramatic change for PSU vendors. Users of Corsair power supplies will be glad to know that the units listed with support on the Intel website linked above were mostly Corsair models!

A potential side effect of this problem might be that motherboard vendors simply disable those sleep states by default. I don't imagine that will be a problem for PC builders anyway, since most desktop users aren't really worried about the extremely small differences in power consumption these states offer. For mobile users and upcoming Haswell notebook designs, though, the increase in battery life is crucial, and Intel has surely been monitoring those power supplies closely.

I asked our in-house power supply guru, Lee Garbutt, who is responsible for all of the awesome power supply reviews on pcper.com, what he thought about this issue.  He thinks the reason more power supplies don't support it already is for power efficiency concerns:

Most all PSUs have traditionally required "some load" on the various outputs to attain good voltage regulation and/or not shut down. Not very many PSUs are designed yet to operate with no load, especially on the critical +12V output. One of the reasons for this is efficiency. Its harder to design a PSU to operate correctly with a very low load AND to deliver high efficiency. It would be easy just to add some bleed resistance across the DC outputs to always have a minimal load to keep voltage regulation under control but then that lowers efficiency.
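
To make that concrete with a simplified example (the 0.5 A "old" minimum load is an assumed round number, not a figure from Corsair): hiding a minimum-load requirement behind a bleed resistor burns power continuously, which is exactly the efficiency penalty Lee describes.

    # Ohm's-law illustration of why bleed resistors cost efficiency.
    rail_v = 12.0
    assumed_min_load_a = 0.5                    # hypothetical old minimum load
    bleed_r = rail_v / assumed_min_load_a       # a 24 ohm resistor guarantees that load
    print(rail_v ** 2 / bleed_r)                # 6.0 W wasted as heat, all the time

    # Haswell's C6/C7 states can pull the real draw down to 0.05 A:
    print(rail_v * 0.05)                        # 0.6 W, below the old minimum load of many supplies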

Source: Tech Report