Wait, copy/pasting levels is a bad thing? Dragon Age 3 reputed to be much less like Halo

Subject: General Tech | April 11, 2012 - 04:19 PM |
Tagged: gaming, dragon age 3, BioWare

If you can believe the hints that BioWare is dropping, Dragon Age 3 should feel a lot less like a very expensive, poorly designed add-on and more like an actual new game.  Of course, this comes from a very quick talk given at PAX about an unannounced game which might or might not be in development, so your sodium intake for the day should be adjusted to compensate.  They also implied that your choices would Effect the ending of the game ... now where have we heard that before?

RPS_dat1.jpg

"Rumours are flying that Dragon Age 3 might be something more like the sequel to Dragon Age we’ve been hoping for. After Dragon Age 2 came out feeling more like a side-project, BioWare have dropped some hefty hints that they’re looking to redress much of that in an unannounced third game for the series. At a PAX East panel, as spotted by Eurogamer and recorded by Gamespot, Dragon Age developers discussed what a hypothetical game might contain, were it to exist, which it currently doesn’t, but obviously does. It’s to be a far more varied game, with new locales, and decisions that carry over from previous games."

Here is some more Tech News from around the web:

Gaming

 

Seems that CPU-less video streaming is not NP-Hard

Subject: General Tech | April 11, 2012 - 02:33 PM |
Tagged: toshiba, NPEngine, video streaming

The NPEngine from Toshiba has a big future with content providers interested in streaming thousands of concurrent videos while saving on space and power.  The article at The Register claims a 70% reduction in size and a 77% reduction in power consumption for a provider serving 100,000 separate streams.  Those savings demonstrate the benefit of purpose-built hardware: what it lacks in versatility it more than makes up for in efficiency.  The server version is due out this year, with the possibility of a single-chip version for laptops and SFF machines which would take the CPU completely out of the picture when playing HD video.

elreg_toshiba_npengine.jpg

"Toshiba's NPEngine hardware directly streams video from SSDs to IP networks without using host server CPU cycles or memory.

Tosh claims the dedicated hardware delivers up to 64,000 x 40Gbit/sec video streams – way more than the 20,000 or so an average 2U server is said to be able to stream. The Toshiba hardware, a server card, can replace at least two video-streaming servers and enable its host server to do other work."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

Epic talks a storm of bullets. Piracy hurt the sequel?

Subject: Editorial, General Tech | April 10, 2012 - 10:45 PM |
Tagged: piracy, epic games, bulletstorm

Mike Capps of Epic Games, among many other developers and publishers, completely misses the point about piracy. No-one can control piracy, they can only control factors which influence it -- but controlling those factors is meaningless if sales are sacrificed in the process. No-one gets paid by not being pirated; people get paid by making sales.

Frequent readers of my editorials are probably well aware that I am quite vocal about many topics including piracy, the consumables model of art, censorship, and used content sales. I take a very mathematical approach to a lot of complicated topics. Unfortunately, much of what passes for common truth is based on fundamentally invalid statistics. It gives me a lot to write about.

Mike Capps of Epic Games was interviewed by GameSpot during PAX East, and at some point the discussion drifted to Bulletstorm. On the topic of its lower-than-expected sales, Capps suggested that the PC version was adversely affected by piracy.

bulletstorm.png

Piracy gnashing its teeth?

Similar statements have been made for countless other games at countless other times. Each of those statements makes a subtle but gigantic mistake in formulating the problem: piracy is not something which does, piracy is something which is. Piracy does not affect your sales, but whatever affected piracy might also affect sales in one way or another.

The intuition is that sales decrease as piracy increases and vice versa. That assumption is nullified by counter-example: do not release a product. Piracy and sales, if you do not release a game, will trend in the same direction: to zero. It is now obvious that sales and piracy do not always inversely correlate.
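The point that sales and piracy can move together rather than inversely is easy to illustrate with a toy simulation. Everything below is made up for illustration: the "interest" factor and the 0.6/0.4 conversion rates are assumptions, not data. Both measurements are driven by the same common cause, so they correlate positively even though neither causes the other:

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)

# Hypothetical model: each of 1000 games generates some amount of consumer
# interest; both sales and piracy rise with interest, plus a little noise.
interest = [random.uniform(0, 1) for _ in range(1000)]
sales = [0.6 * i + random.gauss(0, 0.05) for i in interest]
piracy = [0.4 * i + random.gauss(0, 0.05) for i in interest]

# Sales and piracy correlate strongly and POSITIVELY here, even though
# neither one causes the other -- interest drives both. And if no game is
# released, interest is zero and both measurements fall to zero together.
print(pearson(sales, piracy) > 0)
```

A sketch, not a model of any real market; its only job is to show that "piracy up, sales down" is not a law of arithmetic.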

As Mike Capps also stated in the interview, Bulletstorm had a very rough launch and lifespan on the PC. Bulletstorm required Games for Windows Live, encrypted its settings, and did other things to earn a reputation as a bad console port from launch onward. Customers complained about the experience on the PC, which fueled an inferno of uncertainty and doubt for potential buyers.

Being pirated is not losing a sale, but losing a customer before their purchase is.

I was personally on the fence about Bulletstorm, and this negative word-of-mouth led me to ignore the title. I did not purchase the game, I did not pirate the game; I ignored the game. Perhaps those who pirated the title did so because they were interested, became discouraged, but were not discouraged enough to avoid giving it a chance through piracy?

What I am saying is -- piracy cannot reduce your sales (it cannot do anything, it is a measurement), but perhaps whatever combination of factors reduced your sales may also have increased your piracy?

Piracy is an important measurement to consider -- but it, like sales, is just that, a measurement, nothing more. Strive to increase your sales -- keep an eye on your piracy figures to learn valuable information -- but always exclusively strive to increase your sales. It is the measurement that will pay your bills.

Source: GameSpot

More leaks about NVIDIA's Dual-GPU GTX 690? May be?

Subject: General Tech, Graphics Cards | April 10, 2012 - 07:18 PM |
Tagged: nvidia, leak, GTX 690

More information has surfaced about NVIDIA’s GeForce GTX 690 video card. While other tidbits came to light, perhaps most interesting is the expected May release.

NVIDIA has suffered quite a few leaks around the launch of the GeForce GTX 680 GPU and its associated cards. Benchmarks were accidentally published early and product pages were mistakenly posted prematurely. The hot streak continues.

10-nv_logo.png

It may be time to just reset fate and skip to the GTX 700-series. I mean they will eventually be rebranded 700-something anyway.

I kid, I kid.

Not many specifications were leaked, although there is not much left that cannot already be assumed about the card due to the similarities with its sister part.

The reference model GTX 690 will require two 8-pin power connectors and output via three DVI ports as well as a mini DisplayPort. The already released GTX 680, by contrast, requires two 6-pin connectors and outputs by two DVI, an HDMI, and a full size DisplayPort.

The new card will require more power for its dual GK104 GPUs as the larger power connectors would suggest. While the GTX 680 is happy with 550W of total system power, the GTX 690 would like a system power supply of at least 650W. Since the 680 is expected to draw a maximum of 195W, an extra 100W would put estimates for the 690 power draw at somewhere around 295W.

Unfortunately, estimates based on rated total system power are very inaccurate, as power supply recommendations are usually rounded up to the nearest 50W. Really, the 690 could land anywhere between 245W and 295W, and even those figures are just estimates.
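The back-of-the-envelope math above can be sketched out explicitly. The 50W rounding granularity is this article's assumption about how PSU recommendations are published; the rest is arithmetic on the leaked figures:

```python
# Recommended total system power for each card (from the leak).
psu_680 = 550  # watts
psu_690 = 650  # watts

# Known maximum board power of the GTX 680.
tdp_680 = 195  # watts

# Naive estimate: assume the extra recommended headroom maps 1:1 to the card.
delta = psu_690 - psu_680            # 100 W
high_estimate = tdp_680 + delta      # 295 W

# But recommendations are rounded up to the nearest 50 W, so the true
# difference could be as little as one 50 W step smaller.
rounding_step = 50
low_estimate = tdp_680 + delta - rounding_step  # 245 W

print(low_estimate, high_estimate)  # 245 295
```

Which is exactly why the 245W to 295W range is an estimate of an estimate.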

Still, it looks as though my 750W power supply will survive past May when the leak claims that the GTX 690 is expected to arrive. Yay! May!

Source: EXPreview

Composite Copper and Graphene to make a cool couple.

Subject: General Tech, Cases and Cooling | April 10, 2012 - 06:03 PM |
Tagged: graphene, cooling

Researchers at NC State University have tested the heat dissipation properties of copper-graphene. Their findings suggest that the material could be cheaper and more effective than pure copper.

Some people have gone to ridiculous lengths to cool their components. Some people flush their coolant regularly. Some people will never live down mineral oil jokes. No two computers are not on fire. Awwww.

Copper is regularly used to dissipate heat because, when sufficiently pure, it is an excellent thermal conductor. While copper is expensive, it is not expensive enough to be prohibitive for current use. Alternatives are still being explored, and researchers at NC State University believe graphene might be part of the answer.

bathroomsuctionfan.jpg

Some people stick a bathroom suction fan out a window and run a 3” dryer hose into their case.

As always, I become immediately skeptical when a team of researchers makes a claim such as this. Whether or not my concerns below are valid has yet to be seen, but they come to mind nonetheless. The paper says the material is aimed at power amplifiers and laser diodes.

My first concern is geometry. Effective cooling is achieved by exposing as much surface area between two materials as the situation allows. Higher thermal conductance lets heat leave the source much more efficiently, but that heat still needs to be moved to a reservoir of some sort, such as your room. There has not been much talk about how to remove the heat after copper-graphene so efficiently pulls it from the heat source.
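The surface-area point can be made concrete with Fourier's law of conduction, Q = k·A·ΔT/d. All of the numbers below are illustrative, including the composite's conductivity, which the paper does not pin down; the point is only that conductivity and contact area enter the equation on equal footing:

```python
def heat_flow(k, area_m2, delta_t, thickness_m):
    """Steady-state conductive heat flow in watts through a slab: Q = k*A*dT/d."""
    return k * area_m2 * delta_t / thickness_m

# Illustrative numbers only.
k_copper = 400.0     # W/(m*K), roughly pure copper
k_composite = 460.0  # W/(m*K), hypothetical copper-graphene figure
area = 4e-4          # 2 cm x 2 cm contact patch, in m^2
dT = 30.0            # kelvin across the slab
d = 2e-3             # 2 mm slab thickness

q_cu = heat_flow(k_copper, area, dT, d)                  # 2400 W
q_cg = heat_flow(k_composite, area, dT, d)               # 2760 W
q_cu_double_area = heat_flow(k_copper, 2 * area, dT, d)  # 4800 W

# With these made-up values, doubling the contact area of plain copper
# beats the better material -- and either way the heat still has to be
# dumped into a reservoir downstream.
print(round(q_cu), round(q_cg), round(q_cu_double_area))  # 2400 2760 4800
```

A sketch under stated assumptions, not a claim about the actual composite; it just shows why geometry and the downstream heat path matter as much as the material itself.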

My second concern is with the second layer, indium-graphene. While it seems the amount of indium required is quite small -- just a single layer between the heat source and the copper-graphene -- we do not really know how that scales to real-world applications. Indium is still a very rare element which is heavily mined for touchscreen devices. It might prove to be cheap, but there is only so much of it. Would we also be able to reclaim the indium later, or will it end up in a landfill?

These concerns are probably quite minor, but it is generally good practice not to get too excited when you see a research paper. Two points if you spot any of the following: Nano, Graphene or Carbon Nanotubes, Lasers, and anything related to High-Frequency.

Take a pictorial tour of Kingston's SSD facility

Subject: General Tech | April 10, 2012 - 11:48 AM |
Tagged: kingston, fab, tour, ssd

TweakTown was invited to put on a bunny suit and take a tour of Kingston's SSD manufacturing facility in Taiwan.  Starting from a pile of surface-mount components which are automatically soldered and inspected, with the populated PCB then baked at up to 270°C, they snapped pictures of as much of the process as they could.  From there it is off to the testing facility, where Kingston ensures that all the drives from a particular run are up to the expected standards.  TweakTown does mention a burn-in machine, but unfortunately they were told not to post pictures of it, as Kingston wanted to keep at least a few trade secrets from getting out.  It could also be that they don't want the world to know that they cloned Al several times and use his SSD-killing expertise as the final test before releasing a drive to the channel to be sold.

TT_SSD.jpg

"We were exclusively invited into the Kingston factory where few media have been and got shown the process of making an SSD from start to finish. Due to media restrictions, we were not allowed to produce a video of the tour, but we were allowed to take photos. Obviously Kingston is a market leader in memory and SSD products and there is plenty of sensitive machinery and such - and we needed to respect that and their rules."

Here is some more Tech News from around the web:

Tech Talk

 

Source: TweakTown

Raspberry Pi Computers Pass EMC Compliance Testing

Subject: General Tech, Systems | April 8, 2012 - 08:38 PM |
Tagged: Raspberry Pi, pcb, emc test, computer, compliance testing, arm

The highly anticipated Raspberry Pi ARM computer has run into several launch hiccups, the most recent being that the distributors -- RS and Farnell -- refused to sell and ship the devices until the Raspberry Pi passed proper electromagnetic compatibility (EMC) testing. While such certification is not required for Arduino or Beagle Boards, the companies stated that because the Raspberry Pi was (more) likely to be used as a final consumer product (and not a development board), it needed to pass EMC testing to ensure that it would not interfere with (or be interfered with by) other electronic devices.

According to a recent blog post by the charity behind the ARM-powered Linux computer, the Raspberry Pi has passed EMC compliance testing with flying colors -- after a few hiccups with a network hub, used to test the Raspberry Pi while it was being hit with an EM field, were sorted out.

Raspberry-Pi.jpg

The team has been working out of Panasonic's facility in South Wales to test the Raspberry Pi. With the lab booked for a whole week, they managed to knock out consumer product interference testing for several other regions as well. Notably, the Raspberry Pi is now compliant with the European CE requirements, the United States' FCC rules, Australia's C-Tick, and Canada's Technical Acceptance Certificate (TAC).

Assuming the paperwork is properly filed and RS and Farnell accept the certifications, Raspberry Pi units should begin winging their way to customers shortly. Are you still waiting on your Raspberry Pi, and if so, have you decided what you intend to use it for yet?

If you are interested in the Raspberry Pi, be sure to check out some of our other coverage of the little ARM computer!

NVIDIA Tegra 4 Specifications Sheet Leaks

Subject: General Tech, Mobile | April 7, 2012 - 07:11 PM |
Tagged: tegra 4, tegra, SoC, nvidia, mobile

The Chinese language VR-Zone website has allegedly managed to get their hands on a leaked specifications sheet for NVIDIA’s upcoming Tegra 4 System-on-a-chip (SoC) aimed at mobile tablets. Codenamed “Wayne,” the new SoC will come in several flavors and will arrive next year.

The upcoming chips will reportedly have ten times the performance of NVIDIA’s original Tegra and five times the performance of the current generation Kal-El Tegra 3 chip. NVIDIA has run into several hurdles integrating an LTE cell radio into its SoCs, but if the leaked document is accurate, the company will finally release a Tegra chip with built-in LTE 100 and HSPA42 cell radio capabilities as early as the third quarter of 2013.

Badge_Tegra_3D_large.jpg

Further, the Tegra 4 SoCs will come in four flavors: T40, T43, AP40, and SP3X. The T40 will be the first Tegra 4 chip that manufacturers and consumers will be able to get their hands on -- as early as Q1 2013. It is a quad core part with one companion core and will run at 1.8 GHz. The T43 is an evolution of the T40 and will bump the clockspeed up to 2.0 GHz. The AP40 will be the first budget Tegra 4 processor and will run anywhere between 1.2 GHz and 1.8 GHz. The T43 and AP40 SoCs are reportedly coming out in Q3 2013. All three chips -- the T40, T43, and AP40 -- are based on the ARM Cortex-A15 architecture.

                            T40        T43        AP40         SP3X
Release Date                Q1 2013    Q3 2013    Q3 2013      Q3 2013
Markets Aimed At            Flagship   Flagship   Mainstream   Mainstream
Tablet Device Screen Size   10"        10"        10"          7"
Processor Clockspeed        1.8 GHz    2.0 GHz    1.2-1.8 GHz  1.2-2.0 GHz
Core Count                  4+1        4+1        4+1          4+1
Chip Architecture           A15        A15        A15          A9
Cell Radio                  --         --         --           LTE100/HSPA42

 

The final Tegra 4 chip is the SP3X, and it will also arrive in Q3 2013. Aimed at mainstream tablets with 7” or smaller screens, the upcoming SoC will feature LTE support and will clock between 1.2 GHz and 2.0 GHz. It is a quad core (plus one companion core) part but is reportedly based on the older ARM Cortex-A9 architecture.  The leaked release dates do seem to be in line with earlier reports, though they should still be taken with your daily dose of salt.

 

Right now Tegra delivers on performance, and many high-end mobile devices have incorporated the NVIDIA chip. Even so, NVIDIA still holds very little market share, and the two mainstream Tegra 4 chips -- especially the SP3X with its LTE radio -- should help the company make inroads against Qualcomm and Samsung, who hold a great deal of the market.

Blizzard stimming eSports. Will it lose 10 or 20 health?

Subject: General Tech | April 6, 2012 - 04:16 AM |
Tagged: starcraft 2, blizzard

Blizzard announces the Starcraft II World Championship Series. Tournaments will be held by qualifier region, country, continent, and world-wide. Winning a stage qualifies you for the next stage leading to a single global winner.

There are an astonishing number of tournaments for Starcraft II compared to almost any other strategy game. The game was made famous by its promotion of both the macrogame of economy and production and the microgame of control and positioning. Also, the three races balance one another because they are entirely different, not in spite of it.

Due to the many different play styles as well as the imbalance of information between players and spectators, Starcraft has become a very entertaining spectator sport.

BattleChamp.jpg

At the end of the tournament they should pummel the winners with water guns.

Yes, water guns. Blizzard would never Nerf a Terran.

At the end of the tournament, an officially Blizzard-recognized champion of Starcraft will be crowned. Currently, a handful of players are each crowned the winner of some tournament only to be overthrown at another tournament sponsored by another company. While unofficial tournaments such as MLG and GSL will obviously continue to flourish, Blizzard seems to want an official result that it controls and recognizes.

It is still unclear whether the event will be recurring and at what frequency. Though, chances are, not even Blizzard knows at this point.

Participating countries are listed in their blog posting. Surprisingly, Japan is not present alongside China and South Korea. Official dates have yet to be announced except that the tournament itself is expected to run this year.

Source: Blizzard

Stardock Releases Sins: Rebellion Beta 2, Adds Advent Players

Subject: General Tech | April 6, 2012 - 03:42 AM |
Tagged:

Earlier this week, Stardock officially released the Beta 2 update of the Sins of a Solar Empire: Rebellion 4X RTS game. Best known for SOASE and Impulse (which has since been sold off -- the company now offers its games on Steam), Stardock has been working on the Sins standalone sequel for more than a year now. Over the past year or so, the company has pushed out enough teasers to drive me crazy with lust, so when the beta finally went public I jumped in with the credit card faster than The Flash.

Titan.jpg

The TEC Rebels' Titan Class Ship

Unfortunately, the first few beta updates only allowed gamers to play the TEC (Trader Emergency Coalition) and not the other two alien races. The newly released Beta 2 is now being pushed out via a Steam update and allows gamers to play both the Advent and the TEC (and both sub-factions of each, Rebels and Loyalists). The update also brings several bug fixes and performance improvements. Other notable changes include balance tweaks, a slight (and unfortunate) nerfing of the Titan ships, and the addition of an introduction movie at the start of the game.

Sins of a Solar Empire Rebellio 2012-04-06 02-28-11-15.png

The Advent Eradica Titan Class Ship

I have not had a chance to play much of the new Beta 2 yet, but the first thing I noticed was an additional capital ship for the TEC Loyalists (and possibly the other factions) that is positioned as a support craft and is capable of sending out boarding parties. Pretty neat. The game overall also seems a bit snappier, especially when moving between menus. I’m more of a TEC guy than an Advent player, but I do love crushing them in battle, so I’m glad to see the Advent added to the game. The only big race left is the Vasari at this point, which suggests the game is that much closer to final release. I am cautiously optimistic that Sins: Rebellion is going to be a game worthy of my 30 bucks. Below is a video of the new introduction movie, and the full change log for the Beta 2 release can be found over at Stardock’s Forums.

Have you played the beta yet, and if so what do you make of it?

Source: Stardock