Ubisoft Discusses Assassin's Creed: Unity with Investors

Subject: General Tech, Graphics Cards | February 15, 2015 - 07:30 AM |
Tagged: ubisoft, DirectX 12, directx 11, assassins creed, assassin's creed, assasins creed unity

During a conference call with investors, analysts, and press, Yves Guillemot, CEO of Ubisoft, addressed the issues with Assassin's Creed: Unity, with an emphasis on the positive outcomes going forward. The quarter itself was good, beating expectations and allowing them to raise full-year projections. As expected, they announced that a new Assassin's Creed game would be released at the end of the year, based on the technology they created for Unity and with “lessons learned”.

ubisoft-assassins-creed-unity-scene.jpg

Before optimization, every material on every object is at least one draw call.

Of course, there are many ways to optimize... but that effort works against future titles.

After their speech, the question period revisited the topic of Assassin's Creed: Unity and how it affected current sales, how it would affect the franchise going forward, and how they should respond to that foresight (Audio Recording - The question starts at 25:20). Yves responded that they redid “100% of the engine”, which was a tremendous undertaking. “When you do that, it's painful for all the group, and everything has to be recalibrated.” He continues: “[...] but the engine has been created, and it is going to help that brand to shine in the future. It's steps that we need to take regularly so that we can constantly innovate. Those steps are sometimes painful, but they allow us to improve the overall quality of the brand, so we think this will help the brand in the long term.”

This makes a lot of sense to me. When the issues first arose, it was speculated that the engine was pushing way too many draw calls, especially for DirectX 11 PCs. At the time, I figured that Ubisoft chose Assassin's Creed: Unity to be the first title to use their new development pipeline, focused on many simple assets rather than batching things together to minimize host-to-GPU and GPU-to-host interactions. Tens of thousands of individual tasks being sent to the GPU will choke a PC, and getting it to run at all on DirectX 11 might have diverted resources from, or even caused, many of the glitches. Currently, a few thousand is ideal although “amazing developers” can raise the ceiling to about ten thousand.
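
To make the draw call point concrete, here is a toy sketch in Python (illustration only; the gpu and objects stand-ins are hypothetical and none of this reflects Ubisoft's actual engine) of why submitting every material on every object balloons the call count, and how sorting geometry by material collapses it:

# Toy sketch only: hypothetical `gpu` and `objects` stand-ins, no relation to
# Ubisoft's actual engine code.
from collections import defaultdict

def draw_naive(objects, gpu):
    calls = 0
    for obj in objects:
        for mesh, material in obj.parts:   # every material on every object...
            gpu.set_material(material)     # ...means a state change
            gpu.draw(mesh)                 # ...and at least one draw call
            calls += 1
    return calls

def draw_batched(objects, gpu):
    buckets = defaultdict(list)
    for obj in objects:
        for mesh, material in obj.parts:
            buckets[material].append(mesh)
    for material, meshes in buckets.items():   # one submission per material,
        gpu.set_material(material)             # e.g. via instancing or merged
        gpu.draw_instanced(meshes)             # vertex/index buffers
    return len(buckets)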

This also means that I expect the next Assassin's Creed title to support DirectX 12, possibly even in the graphics API's launch window. If I am correct, Ubisoft has been preparing for it for a long time. Of course, it is possible that I am simply wrong, but it would align with Microsoft's expectation that the first big-budget titles using the new interface will arrive around Holiday 2015, and it would be silly to have done such a big overhaul without planning on switching to DX12 as soon as possible.

Then there is the last concern: If I am correct, what should Ubisoft have done? Is it right for them to charge full price for a title that they know will have necessary birth pains? Do they delay it and risk (or even accept) that it will be non-profitable, and upset fans that way? There does not seem to be a clear answer, with all outcomes being some flavor of damage control.

Source: Gamasutra

No one has talked about mouse mats in a while

Subject: General Tech | February 13, 2015 - 05:09 PM |
Tagged: mouse, gaming mat, input, XTracPads, Carbonic, Ripper, Ripper XXL

They are not the most glamorous of peripherals, but they do save your desk and can help with your accuracy, so pop over to Overclockers Club to take a look at XTracPads.  They offer three different sizes of gaming mat, from the paper-sized Carbonic at 8.5" x 11" x 1/8", to the larger Ripper at 11" x 17" x 1/8", to the immense Ripper XXL at 36" x 18" x 1/8", which is going to cover a goodly piece of your desk.  They are priced at roughly $15, $22, and $35 respectively, so it is not a major investment to pick up, and well worth it if you are looking to replace an old mat which has seen better days.

1_thumb.jpg

"From a casual gamer perspective, I am sure someone who can game competitively will likely notice a greater improvement than I. Personally, I have had trouble with mouse pads that were too hard, not stiff, but solid cutouts of plastic (I don't even know if they are made anymore really). I have also had issues with mouse pads that accumulate a bunch of gross after a bit of use. I can live with poor or cheap mouse pads, but now that I have had a taste of the other side I really don't want to anymore."

Here is some more Tech News from around the web:

Tech Talk

Seagate and Micron become super best friends

Subject: Storage | February 13, 2015 - 02:47 PM |
Tagged: Seagate, micron

The large storage companies have been teaming up for a while now, not simply through mergers and takeovers but also through joint ventures between those who were once competitors.  It is debatable whether consumers will see much cost benefit from this cooperation, but at least the products do seem to improve as specialties are combined.  In this particular case we will see the traditionally disk-based Seagate working with the flash memory maker Micron to develop SAS products as well as SSDs for enterprise customers.  The idea of Serial Attached SCSI SSDs is certainly interesting, but in the current business environment you have to wonder how many companies will have the budget to invest in large-scale migrations to flash-based storage.  It is far more likely this will bring new hybrid storage servers to the market, with SSDs in the front to provide bandwidth for frequently accessed data and HDDs behind them for backups and cold storage.  You can get a quick refresher on the other companies which have started cooperative ventures in the article at The Inquirer.
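
For a sense of what that hybrid idea can look like in principle, here is a toy sketch (illustrative only, and certainly not any Seagate or Micron product design) of an SSD tier holding hot, frequently accessed data in front of an HDD tier used for cold storage:

# Conceptual sketch of a two-tier store: hot data on the SSD tier, least
# recently used entries demoted to the HDD tier. Not a real product design.
import time

class TieredStore:
    def __init__(self, hot_capacity):
        self.hot = {}            # SSD tier: key -> (value, last_access)
        self.cold = {}           # HDD tier: key -> value
        self.hot_capacity = hot_capacity

    def read(self, key):
        if key in self.hot:
            value, _ = self.hot[key]
        else:
            value = self.cold.pop(key)    # miss: promote from the HDD tier
            self._make_room()
        self.hot[key] = (value, time.monotonic())
        return value

    def write(self, key, value):
        self._make_room()
        self.hot[key] = (value, time.monotonic())

    def _make_room(self):
        # Demote the least recently used entries to the HDD tier.
        while len(self.hot) >= self.hot_capacity:
            lru_key = min(self.hot, key=lambda k: self.hot[k][1])
            value, _ = self.hot.pop(lru_key)
            self.cold[lru_key] = value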

index.png

"SEAGATE AND MICRON have announced that they will join forces to work on projects together over a number of years."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Warner Bros. Interactive Entertainment Now on GOG

Subject: General Tech | February 12, 2015 - 03:26 PM |
Tagged: warner bros, pc gaming, GOG

Another publisher signed a deal with GOG to sell and distribute games, DRM-free. To launch their partnership with Warner Bros. Interactive Entertainment, six games have been added and five of them are on sale. LEGO Batman (50%-off), the two LEGO Harry Potter games (each 60%-off), F.E.A.R. Platinum (50%-off), and Bastion (60%-off) will be at their reduced prices all week.

Warner-Bros-Inter_logo.png

The sixth title comes from their acquisition of Midway Games and is actually a three-game combo: Mortal Kombat 1+2+3. One person in the comments said that these are DOS-based versions and that controller support might be a problem (although JoyToKey should solve that problem nicely, especially for a fighting game without analog controls). The first two games only support multiplayer on a single PC, although Mortal Kombat 3 allows LAN play. Of course, LAN support should be easily extended to online multiplayer with people that you know via VPN software, but I have not tried it myself and lag could be a problem.

All six titles are DRM-free, because it's GOG and that's how they roll.

Source: GOG

A flagship A88X board for well under $200, remember the ASUS Crossblade Ranger

Subject: Motherboards | February 12, 2015 - 03:01 PM |
Tagged: motherboard, Kaveri, Intel Gig-E, FM2+, DDR-3 2133, crossblade ranger, audio, asus, A88X

It has been a while since Josh reviewed the ASUS Crossblade Ranger, so it seems appropriate to put up a reminder that there are some impressive AMD boards out there with The Tech Report's review of the board.  This board has just about everything except an M.2 port, from the Asus SupremeFX 2014 audio with high-end caps and EMI shielding, to HDMI, DVI, and VGA display outputs, to a BIOS button on the backplate which allows you to update the motherboard's firmware without a CPU or RAM installed.  Check out the full review to get a list of the other features as well as a glimpse into the personality traits the board displayed during testing.

topdown.jpg

"Asus' Crossblade Ranger is a tweaker-friendly, top-of-the-line motherboard for AMD's Socket FM2+ processors. We kicked the tires and turned up the clocks to see whether the Ranger lives up to its top billing."

Here are some more Motherboard articles from around the web:

Motherboards

Podcast #336 - GTX 960 Overclocking, Plextor M6e Black Edition, AMD R9 3xx Rumors and more!

Subject: General Tech | February 12, 2015 - 01:46 PM |
Tagged: podcast, video, gtx 960, plextor, m6e black edition, M6e, r9 390, amd, radeon, nvidia, Silverstone, tegra, tx1, Tegra X1, corsair, H100i GTX, H80i GT

PC Perspective Podcast #336 - 02/12/2015

Join us this week as we discuss GTX 960 Overclocking, Plextor M6e Black Edition, AMD R9 3xx Rumors and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Maxwell keeps on overclocking

Subject: Graphics Cards | February 12, 2015 - 01:41 PM |
Tagged: overclocking, nvidia, msi, gtx 960, GM206, maxwell

While Ryan was slaving over a baker's dozen of NVIDIA's GTX 960s, [H]ard|OCP focused on overclocking the MSI GeForce GTX 960 GAMING 2G that they recently reviewed.  Out of the box this GPU will hit 1366MHz in game, with memory frequency unchanged at 7GHz effective.  As users have discovered, overclocking cards with thermal protection that automatically downclocks the GPU when a certain TDP threshold is reached is a little trickier, as simply upping the power provided to the card can raise the temperature enough that you end up with a lower frequency than before you overvolted.  After quite a bit of experimentation, [H] managed to boost the memory to a full 8GHz while the in-game GPU clock was hitting 1557MHz, which is at the higher end of what Ryan saw.  The trick was to increase the Power Limit and turn the clock speed up but leave the voltage alone.
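
Running the quick numbers on the clocks quoted above (a back-of-the-envelope check using only the figures in the article):

# Quick arithmetic on the numbers quoted above.
out_of_box_core_mhz = 1366      # in-game boost, out of the box
achieved_core_mhz   = 1557      # [H]ard|OCP's overclocked result
stock_mem_gbps      = 7.0       # effective memory data rate
achieved_mem_gbps   = 8.0

core_gain = (achieved_core_mhz / out_of_box_core_mhz - 1) * 100
mem_gain  = (achieved_mem_gbps / stock_mem_gbps - 1) * 100
print(f"Core: +{core_gain:.1f}%  Memory: +{mem_gain:.1f}%")
# Core: +14.0%  Memory: +14.3%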

1421920414x2ymoRS6JM_2_4_l.jpg

"We push the new MSI GeForce GTX 960 GAMING video card to its limits of performance by overclocking to its limits. This NVIDIA GeForce GTX 960 GPU based video card has a lot of potential for hardware enthusiasts and gamers wanting more performance. We compare it with other overclocked cards to see if the GTX 960 can keep up."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Of gaps of air and hats of tinfoil

Subject: General Tech | February 12, 2015 - 12:51 PM |
Tagged: security, fud

In networking, an air gap refers to a security measure that separates a network from public infrastructure, either physically or through the use of extremely secure tunnelling.  This prevents access to that network over the internet or less secure LANs, and it is used in high-security locations as it is generally considered one of the best ways of securing a network.  As with all things silicon, it is not perfect, and this article at The Register should not be read by the faint of heart.  They describe several methods which have been developed to overcome air gaps; thankfully, most require that the attacker had been able to gain physical access to the air-gapped systems to infect them from within, and as you have heard many times, once an attacker can gain physical access to your systems all bets are off.

What is interesting is the way in which the infected systems transmit the stolen data without the need for physical contact, and how incredibly difficult they are to detect.  Some are able to use the FM frequencies generated by GPUs to send data to cellphones up to 7m away, while another uses the screen's pixels to transmit hidden data in a way that is invisible to the user of the machine.  Other attacks involve spreading infection via microphones and speakers, or a thumbdrive attached to an air-gapped machine which could transmit data over a radio frequency up to 13 kilometres away.  It is a wild world out there, and even though many of the attacks described have only been done in research labs, don't let strangers fondle your equipment without consent!

KiwiconV_c_1600x1200.png

"The custom code had jumped an air gap at a defence client and infected what should have been a highly-secure computer. Sikorski's colleagues from an unnamed company plucked the malware and sent it off to FireEye's FLARE team for analysis."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Super Mario Sunshine and Pikmin 2 at 1080p 60 FPS

Subject: General Tech | February 12, 2015 - 08:30 AM |
Tagged: reverse-consolitis, PC, Nintendo, emulator, dolphin

Update: Fixed the title of "Pikmin". Accidentally wrote "Pikman".

Considering the recent Nintendo license requirements, I expect that their demonstrative YouTube videos will have a difficult time staying online. Regardless, if you are in a jurisdiction where this is legal, it is now possible to play some GameCube-era games at 60 FPS (as well as 1080p) with an emulator on the PC.

dolphin-emulator.png

The blog post at the Dolphin Emulator's website goes into the “hack” in detail. The main problem is that these games are tied to specific framerates, which will cause problems with sound processing and other, time-sensitive bits of code. I have actually been told that one of the most difficult aspects of bringing a console game to the PC (or restoring an old PC game) is touching the timing code. It will break things all over. For Super Mario Sunshine, this also involves patching the game such that certain parts are still ticked at 30 FPS, despite the render occurring at twice that rate.
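
The general technique being described is a decoupled update loop. A minimal sketch, with hypothetical game and renderer objects and no relation to Dolphin's or Nintendo's actual code, looks something like this:

# Minimal sketch: tick timing-sensitive game logic at a fixed 30 Hz while
# rendering at the display rate (60 FPS here). Generic illustration only.
import time

LOGIC_HZ = 30.0
LOGIC_DT = 1.0 / LOGIC_HZ

def run(game, renderer, frame_cap_hz=60.0):
    accumulator = 0.0
    previous = time.perf_counter()
    while game.running:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Timing-sensitive systems (physics, audio cues) stay at 30 Hz...
        while accumulator >= LOGIC_DT:
            game.update(LOGIC_DT)
            accumulator -= LOGIC_DT

        # ...while the renderer draws at 60 FPS, interpolating between the
        # last two logic states for smooth motion.
        renderer.draw(game, alpha=accumulator / LOGIC_DT)
        time.sleep(max(0.0, 1.0 / frame_cap_hz - (time.perf_counter() - now)))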

Also interesting is that some games, like Super Smash Bros. Melee, did not require a game-side patch. Why? Because they apparently include a slow-motion setting by default, which was enabled and then promptly sped up to real time, resulting in a higher frame rate at normal speed. The PC is nothing if not interesting.

Xenon Flashes Make a Case for a Raspberry Pi 2 Case

Subject: General Tech, Systems | February 12, 2015 - 08:00 AM |
Tagged: raspberry pi 2, Raspberry Pi

It did not take long to find a problem with the Raspberry Pi 2. As it turns out, the Pi 2 contains a power regulator chip that is susceptible to bright sources of light. When a metal is struck by enough photons with the correct per-photon energy (which corresponds to the light's frequency, or color), electrons are forced to move, and that is perceived as a voltage (because it actually does cause a voltage).
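
As a rough back-of-the-envelope of what "per-photon energy" means, here are the energies carried by photons of a few wavelengths (standard constants; whether a given photon actually liberates charge depends on the material in the exposed chip package, which the Foundation's post goes into):

# Back-of-the-envelope photon energies, to show why per-photon energy (set by
# frequency/color) is what matters rather than overall brightness alone.
PLANCK = 6.626e-34        # J*s
LIGHT_SPEED = 2.998e8     # m/s
EV = 1.602e-19            # J per electronvolt

def photon_energy_ev(wavelength_nm):
    wavelength_m = wavelength_nm * 1e-9
    return PLANCK * LIGHT_SPEED / wavelength_m / EV

for label, nm in [("near-UV", 350), ("blue", 450), ("red", 700)]:
    print(f"{label:8s} {nm} nm -> {photon_energy_ev(nm):.2f} eV per photon")
# near-UV  350 nm -> 3.54 eV per photon
# blue     450 nm -> 2.76 eV per photon
# red      700 nm -> 1.77 eV per photon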

rasp-pi-failchip.jpg

In the Raspberry Pi 2, this manifests as a voltage drop and the device immediately powers down. This was first discovered by Peter Onion on the Raspberry Pi forums while he was taking photographs of his Raspberry Pi 2. He noticed that each time he snapped a photo, the Pi would shut down. Liz Upton of the Raspberry Pi Foundation promptly confirmed the issue and wrote a long blog post explaining what actually happens. She borrows Peter's joke from the forum thread, that the Pi 2 is camera shy, and explains that “everyday light sources” will not cause this to happen. She then explains the photoelectric effect, the role of the above pictured U16 chip, and the issue itself.

I definitely appreciate Liz Upton and the Raspberry Pi Foundation, founded on the premise of education, taking the time to explain their bugs from an educational standpoint. That said, it is easy to lose sight of your goal when you have a product to defend, and I am glad that it did not get in the way.

A final note: this will not damage the Pi 2, just cause it to crash and power down. The only real problem is that shutting down your device mid-task will crash your task. If that is a write to the SD card, that will likely corrupt that write.

ASUS Launches GTX 750 Ti Strix OC Edition With Twice the Memory

Subject: Graphics Cards | February 11, 2015 - 11:55 PM |
Tagged: strix, maxwell, gtx 750ti, gtx 750 ti, gm107, factory overclocked, DirectCU II

ASUS is launching a new version of its factory overclocked GTX 750 Ti STRIX with double the memory of the existing STRIX-GTX750TI-OC-2GD5. The new card will feature 4GB of GDDR5, but is otherwise identical.

The new graphics card pairs the NVIDIA GM107 GPU and 4GB of memory with ASUS’ dual fan "0dB" DirectCU II cooler. The card can output video over DVI, HDMI, and DisplayPort.

Thanks to the aftermarket cooler, ASUS has factory overclocked the GTX 750 Ti GPU (640 CUDA cores) to a respectable 1124 MHz base and 1202 MHz GPU Boost clockspeeds. (For reference, stock clockspeeds are 1020 MHz base and 1085 MHz boost.) However, while the amount of memory has doubled, the memory clockspeed remains the same at a stock 5.4 Gbps (effective).
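
Quick arithmetic on those clocks (note that the 128-bit memory bus is the standard GTX 750 Ti configuration and is assumed here rather than taken from ASUS' announcement):

# Rough numbers from the clocks quoted above; the 128-bit bus is assumed.
base_oc, base_ref   = 1124, 1020   # MHz
boost_oc, boost_ref = 1202, 1085   # MHz
mem_rate_gbps       = 5.4          # effective per-pin data rate
bus_width_bits      = 128

print(f"Base overclock:  +{(base_oc / base_ref - 1) * 100:.1f}%")
print(f"Boost overclock: +{(boost_oc / boost_ref - 1) * 100:.1f}%")
print(f"Memory bandwidth: {mem_rate_gbps * bus_width_bits / 8:.1f} GB/s")
# Base overclock:  +10.2%
# Boost overclock: +10.8%
# Memory bandwidth: 86.4 GB/s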

Asus GTX 750Ti Strix 4GB Factory Overclocked Graphics Card.jpg

ASUS has not announced pricing or availability for the new card, but expect it to come soon at a slight premium (~$15) over the $160 2GB STRIX 750Ti.

The additional memory (and its usefulness versus the price premium) is a bit of a head-scratcher considering this is a budget card aimed at delivering decent 1080p gaming. The extra memory may help in cranking up the game's graphics settings just a bit more. In the end, the extra memory is nice to have, but if you find a good deal on a 2GB card today, don't get too caught up on waiting for a 4GB model.

Source: TechPowerUp

Tom's Hardware Tests USB 3.1 on MSI's X99A Gaming 9 ACK

Subject: General Tech, Motherboards, Storage | February 11, 2015 - 09:59 PM |
Tagged: usb 3.1, usb, msi, asmedia

UPDATE: Not to be self-serving, but we have our own story online now looking at the performance of early USB 3.1 hardware on PC Perspective as well! Be sure to check that out!

USB 3.0, for storage, is fast. If you are using an external, spindle-based hard drive, it will perform basically as fast as an internal sibling would. Apart from my two SSDs, I do not even have an internal drive anymore. You can safely install games to external hard drives now.

usb-ss-logo.png

But with USB 3.1, the spec doubled to 10 Gbps, which matches the first-generation Thunderbolt connector. A couple of weeks ago, Tom's Hardware put it to the test with an ASMedia USB 3.1 to SATA 6 Gbps developer board. Sure enough, when you RAID a pair of Intel 730 SSDs, you can achieve over 700 MB/s read and write in CrystalDiskMark.
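
For context, the theoretical ceilings work out roughly as follows (counting only the physical-layer encoding overhead, 8b/10b for USB 3.0 and 128b/132b for USB 3.1 Gen 2, and ignoring the rest of the protocol), which is why 700 MB/s is possible on 3.1 but not on 3.0:

# Theoretical line-rate ceilings, ignoring protocol overhead beyond the
# physical-layer encoding (8b/10b for USB 3.0, 128b/132b for USB 3.1 Gen 2).
def ceiling_mb_s(line_rate_gbps, payload_bits, coded_bits):
    usable_gbps = line_rate_gbps * payload_bits / coded_bits
    return usable_gbps * 1000 / 8   # Gb/s -> MB/s

usb30 = ceiling_mb_s(5, 8, 10)      # ~500 MB/s
usb31 = ceiling_mb_s(10, 128, 132)  # ~1212 MB/s
print(f"USB 3.0 ceiling: ~{usb30:.0f} MB/s, USB 3.1 ceiling: ~{usb31:.0f} MB/s")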

Probably the most interesting part of Tom's Hardware's testing is their CPU usage benchmark. While USB 3.0 on Intel's controller choked a full CPU thread, USB 3.1 on ASMedia's controller did not even reach half of a thread's maximum (the CPU in question is a Core i7-5930K Haswell-E at 3.5 GHz).

So, until we get flash drives fast enough to be constrained by USB 3.0's fairly high ceiling, the more immediate benefit of USB 3.1 might simply be reduced CPU usage.

Cherry JD-0400EU Mouse and Keyboard Are Encrypted

Subject: General Tech, Cases and Cooling | February 11, 2015 - 09:36 PM |
Tagged: cherry, AES, aes-128, wireless mouse, wireless keyboard, logitech

When we report on Cherry Corp, it is usually about their mechanical switches, which were (until just recently) the basis of most mechanical keyboards. They also make full keyboards, including non-mechanical varieties, although those are usually designed for enterprise customers. This one is likely intended for that audience.

cherry-jd-0400eu.jpg

Simply put, the Cherry JD-0400EU is a wireless keyboard and mouse combo that encrypts all traffic with 128-bit AES encryption. If you are wondering why no one else thought to do this: they did. Even Logitech has a whole line-up of 128-bit AES-encrypted mouse and keyboard combos. This is not even a feature filled only by niche companies.
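
For a sense of what "encrypting all traffic with 128-bit AES" can look like, here is a minimal sketch. Cherry does not document its exact scheme, so the AES-GCM mode, the 12-byte nonce, and the HID-style report bytes below are assumptions for illustration, not the JD-0400EU's actual design:

# Illustration only: encrypting a keystroke report with 128-bit AES-GCM.
# The mode, nonce size, and report format are assumptions, not Cherry's design.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # shared key established at pairing
aesgcm = AESGCM(key)

def send_report(report: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(12)                  # never reuse a nonce with GCM
    return nonce, aesgcm.encrypt(nonce, report, None)

def receive_report(nonce: bytes, ciphertext: bytes) -> bytes:
    return aesgcm.decrypt(nonce, ciphertext, None)

nonce, ct = send_report(b"\x00\x00\x04\x00\x00\x00\x00\x00")  # HID 'a' press
assert receive_report(nonce, ct) == b"\x00\x00\x04\x00\x00\x00\x00\x00"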

Still, making sure people know that your wireless peripheral is encrypted will probably let you access a whole new audience of government, enterprise, and health care customers. The keyboard itself is based on scissor-switches, which are those non-removable keys that you find on many laptops. They are not high-performance, but they can be quite thin and low-profile. The switch mechanism under the scissor struts is membrane-based.

Pricing and availability are not yet listed.

Source: Cherry

Free Intel Edison Meetup in Phoenix, AZ, February 19th, 2015

Subject: General Tech, Processors, Systems | February 11, 2015 - 09:07 PM |
Tagged: Intel, edison, meetup

This is just a quick note for a small subset of our audience. If any of our developer-minded readers are in the Phoenix, Arizona region on February 19th, Intel will be hosting a meetup at UAT (the University of Advancing Technology). The processor vendor will give a technical presentation about the Edison Internet-of-Things (IoT) developer kit. Shortly after the presentation, the group will move to Aunt Chilada's for a social event.

intel-edison-front.jpg

The presentation will take place in the theatre (there is only one as far as I can tell) at 6:30pm. Admission is free and there will be 10 Intel Edison kits to be raffled. Food and beverages will be provided by Intel (at Aunt Chilada's restaurant).

Source: Intel

Google I/O 2015 Dated. Also: Sharks Left Eggs?

Subject: General Tech, Shows and Expos | February 11, 2015 - 08:26 PM |
Tagged: google io 2015, google io, google

Or is that Left Shark Eggs? Yup, pay attention near the end of the post for an Easter Egg.

Every year, Google hosts their I/O developer conference, which often involves the launching of new hardware and services. This year, it will take place on May 28th and May 29th. Registration to register will open on March 17th at noon ET and it will end on the 19th. If you do not get in, many keynotes and sessions will be broadcast over the internet... because it's Google.

google-io-2015-invite.jpg

Note how I said “Registration to register”? That was not a typo. You are putting your name into a randomizer that will select candidates to actually register and purchase their tickets. Last year, tickets sold out in less than an hour. Apparently Google believes that it is better for the tickets to go to the luckiest individuals, rather than the best F5ers.

Now, onto that aforementioned Easter Egg. A recent meme is “Left Shark” and “Right Shark”, which came to life during Katy Perry's Super Bowl halftime show. The invitation page for Google I/O has a Chrome Experiment that plays music in the browser via WebAudio, with WebGL-based string and oscilloscope effects. For the Easter Egg, open up the developer console with F12 and type “experiment.consoleDance()”. This runs a JavaScript function that paints Left Shark dancing away in your developer console, realized in glorious ASCII art. I am not sure if Microsoft's Christian Heilmann (formerly of Mozilla) found this, or was just the first person that I personally saw talk about it. Either way, his video is embedded above.

I hope this made your day as bright as mine. Basically, I HOPE I RUINED YOUR DAY TOO!

Source: Google

NVIDIA Event on March 3rd. Why?

Subject: Graphics Cards, Mobile, Shows and Expos | February 11, 2015 - 03:25 PM |
Tagged: Tegra X1, nvidia, mwc 15, MWC, gdc 2015, GDC, DirectX 12

On March 3rd, NVIDIA will host an event called “Made to Game”. Invitations have gone out to numerous outlets, including Android Police, who published a censored screenshot of it. This suggests that it will have something to do with the Tegra X1, especially since the date is the day after Mobile World Congress starts. Despite all of this, I think it is for something else entirely.

nvidia-march-3-2015-event.png

Image Credit: Android Police

Allow me to highlight two points. First, Tech Report claims that the event is taking place in San Francisco, which is about as far away from Barcelona, Spain as you can get. It is close to GDC, however, which also starts on March 2nd. If this were meant to align with Mobile World Congress, you ideally would not want attendees to take a 14-hour flight for a day trip.

Second, the invitation specifically says: “More than 5 years in the making, what I want to share with you will redefine the future of gaming.” Compare that to the DirectX 12 announcement blog post from March 20th of last year (2014): “Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC [2014].”

So yeah, while it might involve the Tegra X1 processor for Windows 10 on mobile devices, which is the only reason I can think of that they would want Android Police there apart from "We're inviting everyone everywhere", I expect that this event is for DirectX 12. I assume that Microsoft would host their own event that involves many partners, but I could see NVIDIA having a desire to save a bit for something of their own. What would that be? No idea.

Samsung's phabulous phablet, the Note 4

Subject: Mobile | February 11, 2015 - 02:36 PM |
Tagged: Samsung, Note 4, Exynos 5433, snapdragon 805, phablet

At 5.7" and 176g the Samsung Note 4 is a large device and it has a resolution to match it at 2560x1440.  That resolution does slow it down somewhat, in graphics tests it does fall behind the iPhone 6 Plus except in Basemark X and 3DMark's Ice Storm test but it does show up the competition when it comes to graphical quality with only NVIDIA's Shield beating it on the GFXBench Quality tests.  In the CPU tests it scored moderately well on single threaded applications but wipes the floor with the competition when it comes to multi-threaded performance which you should keep in mind when choosing your purchases.  To see more benchmarks and details The Tech Report's full review can be found right here.

note4-pen.jpg

"Most of the world gets a variant of Samsung's Galaxy Note 4 based on Qualcomm's familiar Snapdragon 805 system-on-a-chip (SoC). In Samsung's home country of Korea, though, the firm ships a different variant of the Note 4 based on Samsung LSI's Exynos 5433 SoC. With eight 64-bit CPU cores and a 64-bit Mali-T760 GPU, the Eyxnos 5433 could make this version the fastest and most capable Note 4--and it gives us some quality time with the Cortex-A53 and A57 CPU cores that will likely dominate the Android market in 2015."

Here are some more Mobile articles from around the web:

Mobile

Is there a Hardline of difference between AMD and NVIDIA?

Subject: General Tech | February 11, 2015 - 01:44 PM |
Tagged: gaming, ea, battlefield hardline

Battlefield Hardline is in public beta for those who have tired of Battlefield 4 and are looking for a new online Frostbite 3 shooter to play, and [H]ard|OCP has run benchmarks to show you what kind of performance you can expect.  They gathered together three cards from each of the two companies, a GTX 980, 970 and 960 as well as an R9 290X DD, 290 and 285, with a mix of default and factory overclocked frequencies.  As of yet there is no Mantle support in the beta, so both vendors are using DX11 in the tests, with the top four cards at 2560x1440 and the remaining two at 1080p, all set to 4X MSAA and Ultra settings except for the Dustbowl map.  The GTX 980 takes the top spot, but the most interesting results are the 290X and 970; the difference is so minuscule that they essentially perform at the same level, and the same can be said of the pricing.  Also worthy of note is that in only one test did the cards use more than 3GB of RAM, and they never hit 3.5GB.

1423459013VEZSb9gsF8_1_1.jpg

"We hopped on the open public beta of Battlefield Hardline this past week and tested performance in all three maps with six video cards to find out how this game performs. We will talk about each map in the beta, and our experiences in terms of performance and gameplay experience so that you will know what to expect in the full game."

Here is some more Tech News from around the web:

Gaming

Source: [H]ard|OCP

Another slice of Pi, this time it is saucier

Subject: General Tech | February 11, 2015 - 01:03 PM |
Tagged: Raspberry Pi 2 Model B, open source

We have seen the improved specs of the new Raspberry Pi 2 Model B, with a quad-core Cortex-A7 based Broadcom SoC running at 900MHz, 1GB of RAM, a 40-pin I/O connector block, and four USB 2.0 ports, with the rest of the inputs remaining the same as on the previous models.  It will still charge at 2A, but you will need an SD card of at least 4GB to fit the OS, which you can buy preloaded or configure yourself if you so desire.  The Inquirer used the SunSpider JavaScript benchmark to test the speed of the new Raspberry Pi, and the results matched the advertised gains: 4452.1ms to completion on the new model as compared to 23692.7ms on the Raspberry Pi Model B+.  If you are using this device's predecessors on a project that would benefit from more power, or from Windows 10 support, this will be a great investment for you.
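
The advertised gain, worked out from the SunSpider numbers above (lower completion time is better):

# Speedup implied by the SunSpider results quoted in the article.
pi2_ms       = 4452.1     # Raspberry Pi 2 Model B
pi_b_plus_ms = 23692.7    # Raspberry Pi Model B+
print(f"Roughly {pi_b_plus_ms / pi2_ms:.1f}x faster to complete SunSpider")
# Roughly 5.3x faster to complete SunSpider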

raspberry-pi-comparison-540x334.jpg

"SWELLING THE RANKS of fruity-themed computers, the Raspberry Pi 2 is an upgraded version of the popular single-board computer, sporting a new processor and double the memory of previous models."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Oops! Incorrect AMD CPUs Allegedly Sold on Amazon

Subject: General Tech, Processors | February 11, 2015 - 09:00 AM |
Tagged: amd, amazon

So allegedly Amazon UK sold some AMD A8-7600 APUs, but they actually shipped Athlon 64 X2 5200+ CPUs. Despite what you would think, it was actually “dispatched and sold” by Amazon UK itself, rather than a dishonest seller who has some explaining to do. For those affected, Amazon is apparently handling customer service well, as expected, and promptly replacing the parts. It does not seem to affect other regions, and the problem started just a short time ago.

amd-new.png

Unless you're Sebastian, these processors will not even fit in the motherboard socket. PC World has an interesting side-by-side comparison of the two pin configurations. They do not look alike at all. You should not have a hard time identifying the problem if you are careful enough to look before you insert, which is obviously something that you shouldn't have to do. Also, AMD refers customers to their authenticity support page for a few extra ways to be sure that the box that you got came from AMD.

The most interesting part of this story would be finding out exactly what happened. Unfortunately, we will probably never know, unless it turns into a famous legal battle of some sort.

Source: Tech Report