All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | February 27, 2013 - 03:50 PM | Jeremy Hellstrom
Tagged: gaming, Crysis 3
While Crysis 3 is an incredible benchmarking tool, bringing even the new TITAN to its knees when played across multiple monitors, it was actually designed to be a game that you play. With Crysis 2 being such a disappointment to PC gamers after the incredible sandbox that was Crysis, many gamers are a little leery of paying full price for the third installment. Techgage recently published a review of the game, all six hours or so of playtime. The suit and weapon upgrade system returns, with the Strength power back from the first game for those who like a more physical style of play. They also added an unfortunate hacking mini-game, and the AI, while better than that of Colonial Marines, is still in need of some polishing. Multiplayer is strong, as anyone who tried the Alpha or Beta can attest, but if you are on the fence about purchasing this game for the single player, you should read through the review.
"Following the events of Crysis 2, we find ourselves back in New York – but this time, things have changed. The city has turned into an urban rainforest, and while CELL is working towards global domination, Ceph continue their war against humanity. Only one man can save mankind: “They call me Prophet”."
Here is some more Tech News from around the web:
- Crysis 3 @ Kitguru
- Homeworld’s For Sale, And Supreme Commander Too
- Assassin’s Creed 4 Gets Pirated @ Rock,Paper,SHOTGUN
- Raging Rainbow: Bit.Trip Runner 2 Out Now @ Rock,Paper,SHOTGUN
- Rifle Report: Arma 3 Enters Playable Alpha On March 5th @ Rock,Paper,SHOTGUN
- Balls For The Ball God – Blood Bowl II Announced @ Rock,Paper,SHOTGUN
- Metal Gear Rising: Revengeance PlayStation 3 @ Tweaktown
- Unigine Oil Rush for Android Review @ Hi Tech Legion
Subject: General Tech | February 27, 2013 - 03:47 PM | PCPer Staff
Dell UltraSharp U2913WM panoramic 29" 2560 x 1080 LED-backlit LCD Monitor for $629.10 with Free Shipping (normally $699.00 - use coupon code: 2SWVM6553NQ6F7).
PNY Prevail 2.5" 120GB SATA 6Gb/s SSD for $107.99 with Free Shipping (normally $120 - use coupon code: VZQG7WPT?PJ4C4).
Western Digital 2TB My Book Live Personal Cloud External Drive for $123.95 with Free Shipping (normally $120 - use coupon code: DIG5).
Altec Lansing M812 Octiv Air Speaker Dock for $94.95 with Free Shipping (normally $400 - use coupon code: DIG5).
EdgeStar 84 Can Beverage Cooler for $199.99 with Free Shipping (normally $300 - use coupon code: BEVFEB30).
Subject: General Tech | February 27, 2013 - 01:11 PM | Jeremy Hellstrom
Tagged: ssd, enterprise ssd, SAS, micron, micron p410m
Micron has announced a new SSD, the P410m, which uses a Serial Attached SCSI interface, perfect for dropping into existing enterprise servers. SATA is perfectly fine for SOHO users and enthusiasts, but for large businesses with a need for extreme reliability, SAS has been the interface of choice. SSD adoption in large businesses has been slowed in part because incorporating SATA SSDs would have required changing their existing SAS-based storage architecture. With the new Micron drive that is no longer necessary; at 7mm thick it will fit in high density servers, and with its 25nm MLC NAND it is expected to survive five years of duty at 10 full drive writes every day. Read more at DigiTimes.
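Those endurance figures add up quickly. As a rough sketch (using a hypothetical 200 GB capacity; the drive ships in several sizes, so treat the specific number as an assumption):

```python
# Back-of-the-envelope endurance math for a drive rated at 10 full
# drive writes per day (DWPD) over a 5-year service life.
capacity_gb = 200          # hypothetical capacity, not an official figure
dwpd = 10                  # full drive writes per day
years = 5

lifetime_writes_gb = capacity_gb * dwpd * 365 * years
lifetime_writes_pb = lifetime_writes_gb / 1_000_000

print(f"Total rated writes: {lifetime_writes_gb:,} GB (~{lifetime_writes_pb:.2f} PB)")
```

That works out to several petabytes of total writes, which is why this class of drive targets mission-critical enterprise duty rather than desktop use.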
"Micron Technology has announced another addition to its growing lineup of solid state drives (SSDs) targeted at data center appliances and enterprise storage platforms. The new Micron P410m SSD is a high-endurance, high reliability 6Gb/s Serial Attached SCSI (SAS) drive built to provide the performance necessary for mission-critical tier one storage applications that require uninterrupted, 24/7 data access."
Here is some more Tech News from around the web:
- AMD to sell a cut down version of Sony's Playstation 4 APU @ The Inquirer
- Google offers a single sign-on system, embraces 10 partners @ The Inquirer
- Benchmarking Ubuntu Linux On The Google Nexus 10 @ Phoronix
- Intel takes on all Hadoop disties to rule big data munching @ The Register
- Stuxnet worm dates back to 2005, Symantec reveals @ The Inquirer
- First Debian/Ubuntu Bootable ARM64 Images Released @ Slashdot
Subject: Editorial, General Tech, Systems | February 26, 2013 - 08:07 PM | Scott Michaud
Tagged: ps4, unreal engine 4
Unreal Engine 4 was present at Sony's Playstation 4 press conference, but that is no surprise. Epic Games has been present at several keynotes for new console launches. Last generation, Unreal Engine 3 kicked off both Xbox 360 and PS3 with demos of Gears of War and Unreal Tournament 2007, respectively. The PS4 received a continuation of the Elemental Demo first released at the end of E3 last June.
All I could think about when I watched the demo was, “This looks pretty bad. What happened?”
If you would like to follow along at home, both demos are available on YouTube:
As you can see from the animated GIF above, particle count appears to have been hit the worst. There appear to be an order of magnitude or two more particles in the PC version than on the PS4: the particle effects around the eyes of the statue are gone entirely, and whole segments of particles are simply not rendered.
In this screenshot, downsampled to 660x355, the loss of physical detail is even more apparent. The big cluster of particles near the leg is not present in the PS4 version, and the remaining cluster is nowhere near as densely packed.
And the lighting, oh the lighting.
On the PS4 everything looks much higher contrast, without a lot of the subtle lighting information. This loss of detail is most apparent in the volcano smoke and the glow of the hammer, but it is also obvious in the character model when viewed in the video.
Despite the 8GB of RAM, some of the textures also seem to be lower resolution. Everything appears to have much more of a plastic look to it.
Still, while the PC version looks better, at least high-end PC gaming should remain within the realm of scalability for quite some time. We have been hampered by being so far ahead of consoles that it was just not feasible to make full use of the extra power. At last, that looks set to change.
Subject: General Tech | February 26, 2013 - 01:45 PM | Jeremy Hellstrom
Tagged: mcafee, security, RSA 2013, sandbox
McAfee has been showing off their stuff at RSA 2013, specifically the new heuristic malware detection capabilities which they will be using instead of their current malware signature database of over 113 million core samples. That signifies a huge change for the antivirus company as it moves to real-time monitoring of all the processes on your machine for suspicious activity instead of matching patterns directly. While this could lead to some interesting side effects for verification software such as you find in some games, McAfee claims 100% effectiveness against current rootkits on Intel hardware compatible with Deep Defender, though they did not give many specifics about that test to The Register.
That is not all they are up to; McAfee just purchased ValidEdge's sandboxing technology, which lets them watch malware as it arrives and infects a machine so they can study its patterns. Interestingly, The Inquirer mentions that signatures will still be recorded, so the claim that the signature database is being abandoned entirely may be an exaggeration; a hybrid of database matching and heuristic monitoring seems more likely. The first software using this new approach will be available in the second half of this year. Also briefly mentioned in the story is a suggestion that McAfee will be able to repair infected computers automatically via the ePO Agent.
"Signature-based malware identification has been around since the dawn of the computer security industry, but McAfee has said it's dumping the system – or rather, adapting it – in an upgraded security suite which will (it claims) virtually eliminate susceptibility to botnets."
Here is some more Tech News from around the web:
- Altera signs up to use Intel's upcoming 14nm process node @ The Inquirer
- HP offloads WebOS to LG for use in televisions @ The Inquirer
- Internet Explorer 10 for Windows 7 @ [H]ard|OCP
- Samsung, Visa in pay-by-bonk tie up @ The Register
- Not so fast, BlackBerry. Now Samsung wants your tasty biz mobe pie @ The Register
- Hacking the International Space Station with a toothbrush @ Hack a Day
- Super single-photon source for quantum computers @ nanotechweb
Subject: Editorial, General Tech, Systems, Mobile, Shows and Expos | February 26, 2013 - 04:19 AM | Scott Michaud
Tagged: Firefox OS, mozilla, firefox, MWC, MWC 13
Mobile World Congress is underway in Barcelona, and this year sees the official entry of a new contender: Firefox OS.
Mozilla held their keynote speech the day before the official start to the trade show. If there is anything to be learned from CES, it would be that there is an arms race to announce your product before everyone else steals media attention while still being considered a part of the trade show. By the time the trade show starts, most of the big players have already said all that they need to say.
If you have an hour to spare, you should check it out for yourself. The whole session was broadcast and recorded on Air Mozilla.
The whole concept of Firefox OS as I understand it is to open up web standards such that it is possible to create a completely functional mobile operating system from it. Specific platforms do not matter, the content will all conform to a platform of standards which anyone would be able to adopt.
I grin for a different reason: should some content exist in the future that is intrinsically valuable to society, its reliance on an open, standards-based platform will allow future platforms to carry it.
Not a lot of people realize that iOS and Windows RT disallow alternative web browsers. Sure, Google Chrome the app exists for iOS, but it is really a re-skinned Safari. Any web browser in the Windows Store must use Trident as its rendering engine by mandate of the certification rules. This allows the platform developer to be choosy about which standards it wishes to support. Microsoft has been very vocal against any web standard backed by Khronos. You cannot install another browser if you run across a web application requiring one of those packages.
When you have alternatives, such as Firefox OS, developers are prompted to try new things. The alternative platforms promote standards which generate these new applications and push the leaders to implement those standards too.
And so we creep ever-closer to total content separation from platform.
Subject: General Tech | February 26, 2013 - 03:29 AM | Tim Verry
Tagged: tracking cookie, Privacy, firefox 22, cookies
Mozilla’s Firefox web browser continues to add new features. A recent patch submitted by Jonathan Mayer proposes an interesting change to the way the browser handles third party cookies. The patch is slated for Firefox 22, and should it be approved, the open source browser would adopt Safari-like behavior: third party cookies would be blocked by default unless the user has visited the website themselves at some point. Users will also be able to tweak the setting via a UI menu item and choose whether to always block third party cookies, only allow cookies from previously visited sites, or allow all third party cookies (for comparison, Google Chrome goes with this last option as its default).
This is a positive move for consumer privacy, but it is also a disruptive strike at online advertisers. So called third party cookies are small pieces of data that sites can use to identify and track users across other sites. Cookie uses range from a shopping site implementing carts or coupons to ad networks that follow you around the internet to deliver targeted advertising and gather information about users. Safari has managed to get away with blocking third party cookies by default so far, but Firefox has a great deal more market share. Should Firefox move to a block-by-default model, advertisers are not likely to be pleased, considering they think that Do Not Track is bad enough (heh). I think it may need to be relaxed somewhat, but the proposed patch’s behavior is closer to a fair balance between privacy and tracking than the current arrangement.
Currently, you can choose to accept all or block all (with accept all being the default). The new patch would add a new option to the GUI menu to only allow cookies from previously visited sites.
Interestingly, this is not the first time that changes to Firefox’s cookie handling behavior have been proposed. A few years ago, developers considered a similar patch but found that it caused too many problems with websites. It is worth noting that Jonathan Mayer's patch is not as strict in what it blocks as that previous attempt, so it is more likely to be approved--and to break fewer sites out of the box. Then again, the more browsers that adopt a block-by-default policy for third party cookies, the more websites will be pressured into finding workarounds, such as proxying third party ad cookies through their own domain (making the cookies first party as far as the browser is concerned). In the end, the battle between consumers and advertisers will rage on, with websites and publishers caught in the middle trying to find an acceptable balance.
It will be interesting to see whether this patch goes through and what the fallout (if any) will be.
What do you think about the proposed change to the default cookie handling setting? Are you already using a third party browser plugin with a white list to block them by default anyway?
Also Read: Firefox 19 Includes Built-In PDF Viewer @ PC Perspective.
Subject: General Tech | February 26, 2013 - 02:54 AM | Scott Michaud
Tagged: 2K, bioshock, bioshock infinite
So it was announced just a couple of days ago that Bioshock Infinite will be boosted with three pieces of expansion DLC. What will they be? Who knows! Rest assured, the marketers have declared there will be “new stories, characters, abilities, and weapons.”
Phew! I was worried that I would only get a soundtrack or something... wait, no I wasn't.
Give me more money, or I cut you.
Publishers, these days, have been looking for new methods to increase the price of games and prevent their discs from being resold on the used market. We seem to be escaping the dark era where single-player games were condemned as fiscal black holes from which your capital would never be seen again. The view was that a solo experience would be completed before the publisher finished monitoring its sales figures, and the used market would eat the rest of the sales curve. The solution, clearly, was to toss even more capital at those games to tack on a multiplayer component that no one played, make the loss look even worse on paper, and further justify your fears of used sales and piracy.
And really, we as consumers are part of the problem when we expect the $50 or $60 price-point. Of course, we expect that price-point because we have been conditioned to expect that value fairly across-the-board. We have begun to see games, mostly indie titles, come in at lower launch prices, particularly on digital distribution platforms.
The biggest problem is this: publishers do not need to find the largest value any one customer would pay for their content; publishers need to find the price that maximizes the product of price and probability of purchase across all potential buyers. On Steam you see this explode during sales, where a moderate price reduction yields a massive increase in units sold, with even a halo effect when the price returns to its norm.
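The pricing logic above -- maximize price times probability of purchase -- can be sketched with a toy demand curve (every number here is invented purely for illustration):

```python
# Toy illustration: pick the price that maximizes expected revenue per
# potential buyer, i.e. price * P(purchase). The demand curve below is
# entirely made up; real publishers estimate it from sales data.
demand = {60: 0.10, 45: 0.18, 30: 0.35, 15: 0.60, 5: 0.80}

best_price = max(demand, key=lambda p: p * demand[p])
expected_revenue = best_price * demand[best_price]

print(f"Best price: ${best_price}, expected revenue per buyer: ${expected_revenue:.2f}")
```

In this made-up curve, a mid-tier price beats both the $60 launch price and the deep discount, which is the same dynamic Steam sales exploit.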
So what about Bioshock?
In this case, honestly, the game will probably be worth more than its $60 price tag in terms of development effort versus the risk of finding its audience. As such, the publisher will add some attach rate of extra content for a moderate price addition. This is one more example of how members of the industry continue to avoid risk; in this case, they want to spread it out over multiple products.
At least they didn't, you know -- be Irrational (heh heh heh), and toss that development money at the used sales bogeyman. At least they will get the money they expect.
Subject: General Tech | February 25, 2013 - 04:06 PM | Jeremy Hellstrom
Tagged: input, gaming keyboard, gaming mouse, AZiO, Levetron GM2000, Levetron Mech5, Levetron
At a glance, AZiO's Levetron Mech5 keyboard has a lot of extras for gamers, with a keypad that can be relocated to the left side of the keyboard as well as a six key macro pad which can be attached almost anywhere and assigned up to 12 functions. There is also a volume knob and a switch to disable the Windows key for those games which just don't like losing focus. Unfortunately, while these extras functioned reasonably well, Overclockers Club thought the overall design felt rather cheap and not what they expected from a keyboard that costs over $100.
The GM2000 mouse came out a bit better, in part because it costs around $40, but also thanks to its light weight and DPI features. On the other hand, they ran into problems with button response and, judging from other reviews of this mouse, they are not the only ones who did. Still, it is reasonably priced and will get you in the game quickly.
"Mechanical switches are becoming the typical switch in most gaming or enthusiast builds – at this point just having the mechanical switch isn't enough to warrant such high dollar signs. That is where I feel the L3VETRON Mech5 was a major let down. It reminded me of the toy that looked super in the box until you saved up your money to buy it and find out what crap it actually was. The features of a movable number pad as well as the little Macro keypad do deserve some merit in the overall review. Although I'm not big on using macros the ability to choose to have them is nice while not massively increasing the standard layout of the keyboard. The removable ability and varied placement of the number pad was by far my favorite part of all of the AZiO products today. Just the ability to customize my layout in a LEGO sort of manner was like being a kid all over again – loved it."
Here is some more Tech News from around the web:
- AZiO Levetron GM-2000 Mouse Review @ Hi Tech Legion
- Zowie AM - Ambidextrous Mouse @ eTeknix
- Logitech Bluetooth Easy-Switch Keyboard Review @ Legit Reviews
- Cooler Master CM Storm Sentinel Advance II Gaming Mouse @ Modders-Inc
- Logitech G600 MMO Gaming Mouse Review @ Legit Reviews
- Func MS-3 Mouse and Surface 1030XL Mousepad Review @ Hi Tech Legion
- SteelSeries Free Mobile Wireless Controller Review @ Madshrimps
- Rosewill Gaming Keyboard RK-8100 @ Benchmark Reviews
- Gigabyte Aivia Osmium Mechanical Keyboard @ Benchmark Reviews
- Das Keyboard Model S Professional @ Benchmark Reviews
- CM STORM TRIGGER Mechanical Gaming Keyboard Review @ NikKTech
- AZiO Levetron Mech5 Modular Mechanical Keyboard Review @ Neoseeker
Subject: General Tech, Graphics Cards | February 25, 2013 - 01:32 PM | Jeremy Hellstrom
Tagged: jon peddie, graphics, market share
If last week's report from Jon Peddie Research on sales of all add-in and integrated graphics had you worried, the news this week is not going to help boost your confidence. This week the report focuses solely on add-in boards, and the drop is dramatic; Q4 2012 sales plummeted just short of 20% compared to Q3 2012. Across the entire year, sales dropped 10% overall as AMD's APUs make serious inroads into the mobile market, as do Intel's, with many notebooks being sold without a discrete GPU. The losses are coming from the mainstream market; enthusiast level GPUs actually saw a slight increase in sales, but that small volume is utterly drowned by mainstream volume. You can check out the full press release here.
"JPR found that AIB shipments during Q4 2012 behaved according to past years with regard to seasonality, but the drop was considerably more dramatic. AIB shipments decreased 17.3% from the last quarter (the 10 year average is just -0.68%). On a year-to-year comparison, shipments were down 10%."
Here is some more Tech News from around the web:
- 3DMark Review @ OCC
- Trendnet N300 Easy-N-Range Extender @ Rbmods
- NETGEAR ProSafe GS110T Gigabit SmartSwitch @ Benchmark Reviews
- Quantum computer one step closer after ‘true’ quantum calculation @ The Register
- Microsoft brings Azure back online @ The Register
- Understanding Camera Optics & Smartphone Camera Trends, A Presentation by Brian Klug @ AnandTech
- MWC Sunday roundup: HP Slate, Ascend P2 and Firefox phones @ The Inquirer
- AMD releases Firepro R5000 with remote display technology @ The Inquirer
- The TR Podcast 129: PlayStation 4, Titan, and more
Subject: General Tech | February 25, 2013 - 12:47 PM | PCPer Staff
Logitech G710 USB 2.0 Wired Gaming Mechanical Keyboard for $127.49 with Free Shipping (normally $150 - use coupon code: 8LG7XLCVX7VL21).
Western Digital 4TB My Book Essential USB 3.0 Desktop Hard Drive (WDBACW0040HBK) for $164.00 with Free Shipping (normally $720 - use coupon code: DIG5).
Technine LM Pro Flat Snowboard Magoonies for $359.96 (normally $450).
Star Wars Darth Vader Bank for $19.27 (normally $30).
Timex Im Global Trainer GPS S+D Watch for $190.95 with Free Shipping (normally $300).
Subject: General Tech, Systems | February 25, 2013 - 02:18 AM | Scott Michaud
Tagged: Chromebook Pixel, Chromebook
We have covered many Chrome OS-based devices, even a pair of reviews, but we have never seen the platform attempt to target the higher-end of the price spectrum. As you could guess by my ominous writing tone, that has changed.
The development commentary video could have been an Apple advertisement. We will embed it below, but it definitely has that whimsical tone we all know and groan at. The Pixel is heavily focused on design and screen quality.
The display is quite small, just under 13”, but it has a higher resolution than professional-grade 30” monitors; it even leapfrogs the Yamakasi Catleap. When trying to visualize the use case, the first thought that comes to mind is a second PC for someone to take with them. If you can get a really high resolution experience with that, then bonus. Right?
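The density claim is easy to check. The Pixel's widely reported panel is 2560x1700 at 12.85 inches (those figures come from Google's published specs, not the text above), and pixel density is simply the diagonal pixel count divided by the diagonal size:

```python
import math

def ppi(width_px, height_px, diag_inches):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diag_inches

# Chromebook Pixel: 2560x1700 on a 12.85" panel (widely reported specs)
print(round(ppi(2560, 1700, 12.85)))   # roughly 239 PPI

# Typical professional 30" monitor: 2560x1600
print(round(ppi(2560, 1600, 30.0)))    # roughly 101 PPI
```

So the Pixel packs more than double the pixel density of a 30" desktop panel, despite slightly more total pixels as well.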
The specifications, according to their Best Buy product page, are actually quite decent for a web browser-focused device.
- Ivy Bridge Core i5
- 4GB DDR3 RAM
- 32GB SSD
- Intel HD 4000 Graphics
The downside? The price starts at $1299 USD and goes up from there. For $150 more, at the $1449 price point, you can get a larger SSD and LTE, if you can wait until April.
Once you factor in the price, and a mighty big factor it is too, it becomes really difficult to figure out whom Google is targeting. The only explanation that makes sense to me is a high-end laptop that is easy for IT departments to manage for executives and students.
Lastly, 4GB of RAM is ridiculously cheap nowadays. Could it have killed them to add in a little extra RAM to get more headroom?
Also, what about the lack of connectivity to external displays? (Update: Sorry, just found mini displayport on the product tech specs.)
Subject: General Tech | February 22, 2013 - 12:23 PM | Jeremy Hellstrom
Tagged: nvidia, jen-hsun huang
NVIDIA will have a new nerve center across the street from their existing headquarters; from what Jen-Hsun told The Register, they are almost at the point where they need bunk-desks in the current HQ. The triangle pattern the artist's concepts show not only embodies a key part of NVIDIA's technology but is also a well recognized technique in architecture for providing very sturdy construction. Hao Ko was the architect chosen for the design; his resume includes a terminal at JFK airport as well as a rather tall building in China. For NVIDIA's overlord to plan such an expensive undertaking shows great confidence in his company's success, even with the shrinking discrete GPU market.
"Move over Apple. Nvidia cofounder and CEO Jen-Hsun Huang wants to build his own futuristic space-station campus – and as you might expect, the Nvidia design is black and green and built from triangles, the basic building block of the mathematics around graphics processing. And, as it turns out, the strongest shape in architecture."
Here is some more Tech News from around the web:
- TSMC supply of 28nm chips remains tight @ DigiTimes
- AMD and The Sony PS4. Allow Me To Elaborate @ AMD
- Google reveals Glass details in patent application @ The Register
- Build your own dumb USB power strip @ Hack a Day
- ARM and Synopsys tape out a Mali-T658 GPU at 20nm @ The Inquirer
- Philips Gioco review: Ambiglow on your desk and much more @ Hardware.Info
- Win a MSI GTX 670 Twin Frozr Power Edition OC 2GB @ eTeknix
Subject: General Tech | February 22, 2013 - 07:31 AM | Tim Verry
Tagged: servers, facebook, exabyte, data centers, cold storage, cloud computing
Facebook is planning to construct a new cold storage facility to house archived and less-frequently-used media files. The new data center will reside in a new 62,000 sq. ft. building on the company's existing 127-acre property in Prineville, Oregon.
As cold storage, the data center will house servers with up to 3 exabytes of total data capacity. The machines will be in a sleep state the majority of the time, but will be automatically woken to serve up media files when they are accessed on the social network. Because the servers are normally in a low-power sleep state, there will be a slight delay when users request files; according to Oregon Live, Facebook has stated the delay will range from as little as several milliseconds to as much as a couple of seconds.
The new cold storage facility will enable Facebook to save a great deal on electrical usage and hardware wear and tear (though primarily power bill savings). The company claims that its users upload 350 million photos each day, but that 82% of the social networking site's traffic focuses on a mere 8% of available photos.
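To put those figures in perspective, here is a rough sketch of how long 3 exabytes would last at the stated upload rate. The 2 MB average photo size is purely my assumption, not a Facebook figure:

```python
# Rough capacity math: days to fill 3 EB of cold storage at the stated
# upload rate. Average photo size is an assumption for illustration.
photos_per_day = 350_000_000
avg_photo_mb = 2                      # assumed average size, not official
capacity_eb = 3

daily_tb = photos_per_day * avg_photo_mb / 1_000_000   # MB -> TB
capacity_tb = capacity_eb * 1_000_000                  # EB -> TB

print(f"~{daily_tb:.0f} TB/day, ~{capacity_tb / daily_tb:.0f} days to fill")
```

Under those assumptions the facility holds on the order of a decade of photo uploads, which helps explain why Facebook is planning three build-out phases rather than one.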
Err, not quite the cold storage Facebook has in mind...
Considering Facebook's existing Prineville data center used a whopping 71 million kilowatt-hours of power in its first 9 months, moving to a new cold storage system for infrequently accessed files is an excellent idea. The photos will still be available, but Facebook will save big on the power bill--a fair compromise for retaining all of those lolcat and meme photos, I think.
The new data center will be rolled out in three phases, each measuring 16,000 sq. ft. in the Prineville facility. The first phase of cold storage servers should be up and running by Q4 2013. There is no estimate on the power savings, but it will be interesting to see how beneficial it will be--and whether other cloud service providers will adopt similar policies.
Also read: Amazon Glacier offers cheap long-term storage.
Subject: General Tech | February 22, 2013 - 05:11 AM | Tim Verry
Tagged: web browser, pdf viewer, mozilla, firefox
Additionally, Mozilla has fixed several bugs and improved performance. The browser now starts up more quickly than previous versions, and a WebGL drawing operation error has been corrected, for example. Further, Firefox 19 recognizes more CSS features, including @page and support for fixed-width text transformations. A new debugger has also been added, which should help add-on developers test their code. Mobile users running Firefox for Android will also be pleased to know that Mozilla has relaxed the CPU clockspeed requirement to a mere 600 MHz, allowing the mobile browser to run on even more Android devices.
The new version is available for download from the Mozilla website.
Linaro Forms Linux Networking Group to Collaborate on Open Source Software for ARM Networking Hardware
Subject: General Tech | February 22, 2013 - 02:16 AM | Tim Verry
Tagged: oss, open source, networking, linux networking group, linux, linaro, arm
Linaro, a non-profit engineering group, announced a new collaborative organization called the Linux Networking Group at the Embedded Linux Conference in San Francisco this week. The new group will work on developing open source software to be used with ARM-based hardware in cloud, mobile, and networking industry sectors. Of course, being open source, the software for ARM SoCs will be used with Linux operating systems. One of the Linux Networking Group’s purposes is to develop a new “enhanced core Linux platform” for networking equipment, for example.
The new Linux Networking Group is currently comprised of the following organizations:
- Nokia Siemens Networks
- Texas Instruments
The new cooperative has announced four main goals for 2013:
- "Virtualization support with considerations for real-time performance, I/O optimization, robustness and heterogeneous operating environments on multi-core SoCs.
- Real-time operations and the Linux kernel optimizations for the control and data plane.
- Packet processing optimizations that maximize performance and minimize latency in data flows through the network.
- Dealing with legacy software and mixed-endian issues prevalent in the networking space."
Reportedly, Linaro will have an initial software release within the first half of this year. Further, the organization will follow up with monthly software updates to improve performance and add new features. More collaboration and the furthering of ARM-compatible open source software is always a good thing. It remains to be seen how useful the Linux Networking Group will be in pushing its ARM software goals, but here’s hoping it works out for the best.
The full press release can be found below.
Subject: General Tech | February 21, 2013 - 12:27 PM | Jeremy Hellstrom
"Mozilla's Firefox web browser now includes a built-in PDF viewer - allowing users to bin plugins from Adobe and other developers.
Here is some more Tech News from around the web:
Subject: General Tech | February 21, 2013 - 02:58 AM | Ken Addison
Tagged: titan, Tegra 4i, tegra 4, ssd, ps4, podcast, nvidia, Intel
PC Perspective Podcast #239 - 02/21/2013
Join us this week as we discuss NVIDIA GTX TITAN, PlayStation 4 Hardware, SSD Endurance and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano
Program length: 0:59:53
Podcast topics of discussion:
- Week in Reviews:
- 0:27:20 This Podcast is brought to you by MSI!
- News items of interest:
0:49:00 Hardware / Software Pick of the Week
- Ryan: Excel 2013.. or not.
- Jeremy: WobbleWorks $75 Pen-Sized 3D Printer on Kickstarter
- Josh: Sweet lookin Monitor... not without quirks
- Allyn: Gunnar
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
Subject: General Tech, Systems | February 20, 2013 - 06:53 PM | Scott Michaud
Tagged: sony, ps4
We're currently in the middle of Sony's Playstation announcement and right off the bat they discussed system specifications.
(Update 2: Press conference was over a few hours ago, and we now have an official press release.)
The Playstation 4, as it will be titled, is very similar to a mid-range gaming PC. During discussions, developers asked Sony to stick with a typical x86-based architecture. Of course that does not stop Sony from describing it as a “Supercharged PC architecture”. Still, they do seem to have quite a decent amount of hardware in this box.
- 8-core x86 CPU
- ~2 Teraflops GPU integrated on the same chip
- (I did not hear AMD mentioned, but it totally is.)
- 8GB GDDR5 RAM (shared)
- Stereo camera, with a light bar on the controller (like the Wii) to judge distance to the TV
- Touch sensor in the controller
- (Update/correction: At least a ...) spindle-based hard drive
While these specifications have been sufficiently leaked in the recent past, we had not been able to pin down exactly how much RAM is provided. We knew the development kit contained 8GB of system memory, but development kits require more RAM than the system they emulate, to account for development tools and unoptimized assets.
As it turns out, the system itself will contain 8GB of GDDR5 shared between the CPU and GPU, which is quite a lot. Developers will need to finally push the PC platform past the 4GB RAM+VRAM 32-bit barrier in order to keep up with the next generation consoles.
Most of our gaming limitations have been due to art assets constrained by memory. Thanks to the new Sony console, PC releases could finally be taken off the 512MB-long leash of Sony and Microsoft.
(Update 2, cont.: The press release has official tech specs as below but are "subject to change")
- Single-Chip Custom Processor
- CPU: x86-64 AMD "Jaguar", 8 cores
- GPU: 1.84 Teraflops, AMD next-generation Radeon(tm)-based graphics engine
- Hard Disk Drive: Built-in
- Optical Drive (Read-Only): BD 6x CAV, DVD 8x CAV
- I/O Ports: Super-Speed USB (USB 3.0), AUX
- Communication: Ethernet (10BASE-T, 100BASE-TX, 1000BASE-T), IEEE 802.11 b/g/n, Bluetooth 2.1 (EDR)
So clearly Sony was rounding up slightly when they claimed a 2 Teraflop GPU. Still, this looks to be a healthy computer.
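The 1.84 Teraflops figure falls straight out of the GPU's shader count and clock. The 18 compute units and 800 MHz clock below are widely reported figures rather than numbers from the press release, so treat them as assumptions:

```python
# PS4 GPU throughput: 18 GCN compute units x 64 shaders each, at a
# widely reported 800 MHz clock, with 2 FLOPs per shader per clock
# (one fused multiply-add).
shaders = 18 * 64            # 1152 stream processors
clock_hz = 800e6             # 800 MHz (reported, not official)
flops_per_clock = 2          # fused multiply-add

tflops = shaders * clock_hz * flops_per_clock / 1e12
print(f"{tflops:.4f} TFLOPS")   # 1.8432, rounding to the official 1.84
```

That product lands exactly on the 1.84 Teraflops in the spec sheet, which Sony then rounded up to 2 on stage.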
We now have the official confirmation we needed that AMD Jaguar cores will power the PS4. Given AMD's big wins across the console platforms, I wonder whether game developers will take the tricks they learn over the next few years and start optimizing PC games for AMD CPUs.
GPUs too for that matter... this could mean a lot for AMD's PC gamers.
Subject: General Tech | February 20, 2013 - 05:50 PM | Jeremy Hellstrom
Tagged: audio, JDSLabs, O2, ODAC, amp
If swappable op-amps, an Objective DAC for proper analog sound, and a frequency range of 10 Hz - 19 kHz not only make sense to you but get you a bit hot and bothered, JDSLabs has an interesting product for you. Built around the open source Objective2 amplifier, this pre-amp and DAC is designed to produce professional quality sound without the price tag attached to that level of gear. The suggested price is $285, which might seem high for a headphone amp but is actually much less expensive than professional level kit. TechPowerUp were very impressed with the quality of the sound and the tweaks that can be made to the DAC, but thought that perhaps a bit more thought could have been put into the aesthetics of the device.
"JDSLabs's O2+ODAC combination combines two designs by NwAvGuy with excellent build quality and a sturdy enclosure. The O2+ODAC sells at just $285 which, performance considered, is a bargain. We test whether this amplifier and DAC combination is really the giant slayer it is made out to be!"
Here is some more Tech News from around the web:
- Eagletech Arion Foldable Bluetooth 3.0 Headset Model:ARHP200BF @ TechwareLabs
- ASUS Orion Pro Headset Review @ Neoseeker
- House of Marley Redemption Song On-ear Headphones Review @ TechwareLabs
- Razer Star Wars: The Old Republic Gaming Headset Review @ Madshrimps
- AZiO Levetron GH808 Gaming Headset Review @ Custom PC Review
- Asus Orion Pro @ LanOC Reviews
- Mad Catz F.R.E.Q.5 Red Stereo Gaming Headset Review @ NikKTech
- HiFiMAN RE-400 In-ears @ techPowerUp
- Ineo Alienvibes S101 2.0 Desktop Speakers Review @ Hi Tech Legion
- Wavemaster Moody 2.1 Speaker System @ eTeknix