Subject: General Tech | March 12, 2012 - 09:54 PM | Tim Verry
Tagged: tDCS, overclocking, DIY, brain, augmentation
The new ExtremeTech (RIP ET Classic) recently ran an article about some overclocking goodness, but with a twist. Instead of the typical CPU or GPU hardware, the article covers overclocking some wetware: the human brain. More specifically, a DIY kit called the GoFlow is in the works to enable affordable tDCS, or transcranial direct-current stimulation, which stimulates the brain into a "state of flow," enabling quicker learning and faster response times.
The GoFlow β1 will be a $99 do-it-yourself tDCS kit that directs you in placing electrodes on the appropriate areas of your scalp and then pumping direct current from a 9V battery at 2 milliamps through your brain, enticing the neurons into a state of flow. This makes them more malleable, helping to create new pathways and increase learning speed, as well as allowing them to fire more rapidly, bolstering thought processes. The kit, which is not yet available for purchase, will not require any soldering knowledge and will be housed in a plastic case along with wires, schematics, and a potentiometer to dial in the right amount of power.
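For a sense of what that potentiometer is doing, here is a back-of-the-envelope current-limiting sketch. The 9V supply and 2 mA target come from the article; the 2 kΩ skin/electrode resistance is an illustrative ballpark assumption, not a GoFlow spec.

```python
# Back-of-the-envelope math for limiting a 9 V supply to 2 mA.
# The load resistance here is an illustrative assumption, not a GoFlow spec.
V_BATTERY = 9.0    # volts, standard 9V battery (from the article)
I_TARGET = 0.002   # amps (the 2 mA stimulation current cited)

def series_resistance(v_supply, i_target, r_load):
    """Total series resistance needed to hold current at i_target,
    minus the load's own resistance -- the remainder is what the
    potentiometer must supply. Plain Ohm's law: R = V / I."""
    r_total = v_supply / i_target
    return r_total - r_load

# Skin/electrode resistance varies widely; assume ~2 kOhm for illustration.
r_pot = series_resistance(V_BATTERY, I_TARGET, r_load=2000)
print(r_pot)  # 2500.0 -> the pot dials in the remaining ~2.5 kOhm
```

Since skin resistance drifts as the electrodes settle, a fixed resistor alone wouldn't hold 2 mA steady, which is presumably why the kit includes an adjustable pot at all.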
The company behind the GoFlow β1 has further referenced cases of successful short-term tDCS testing, including tests of UAV drone pilots and professional gamers all learning their respective trades more quickly than average. ExtremeTech also mentions that tDCS can have therapeutic effects for people affected by Parkinson’s or post-stroke motor dysfunction.
Right now, the kit is still in the works, but interested users can sign up on the GoFlow website to be notified when it becomes available. At $99, is this something worth a shot, or do you prefer not to void the warranty on your brain? (heh) Personally, statements on the website such as "our tDCS kit is the shit" and "get one of the first β1's and will help us develop β2" are not exactly instilling confidence in me, but if you're big into the early adopter adventure, GoFlow may have something for you to test.
Subject: General Tech | March 12, 2012 - 03:20 PM | Jeremy Hellstrom
Tagged: nvidia, kepler, GK104, gtx680
Whether you are an NVIDIA fan or not, you are likely anxiously awaiting the release of NVIDIA's new GPU. Since AMD currently holds the lead in graphics cards, there is no competition to push down the price of their high end cards, which can beat anything NVIDIA currently has on the market. Depending on the price and performance of the new Kepler chip, its release should have an effect on the pricing of at least one line of AMD cards, be it Pitcairn or Cape Verde. This is why SemiAccurate's pegging of the expected release dates, both paper and physical, is worth taking a look at, especially if your GPU recently died and you are shopping for a new one. We should have a good idea of the price, performance, and availability by the end of March.
"Today, March 8, is the day where the press that Nvidia flew in get their ‘tech day’, basically the deep dive on Kepler. Then, like we said a few days ago, Nvidia will paper launch the cards next Monday, March 12. From that point, things get a little hazy because people are arguing over two different days. Some are saying Friday March 23, others Monday March 26, with a few more saying 23 than 26. It could be none of the above though, but if you bet on two weeks out, you won’t be far off."
Here is some more Tech News from around the web:
- Why did TSMC stop 28nm production? @ SemiAccurate
- In-Depth with the Windows 8 Consumer Preview @ AnandTech
- Six screens, one desk @ The Tech Report
- Brewtarget: Hop into Beer Brewing with Open Source @ Linux.com
- Win an LG Optimus 3D, and £50 Credit from Three! @ Tech-Reviews.co.uk
- MASTERS OF OVERCLOCKING: Final Result @ kitguru
In a recent press release, the Linux Foundation announced four new members, one of which is a big deal in the graphics card industry. Joining alongside Fluendo, Lineo Solutions, and Mocana is the green GPU powerhouse NVIDIA. According to Maximum PC, there is talk around the web of the company moving to open source graphics drivers; however, NVIDIA has not released anything to officially confirm or deny it.
The Linux Foundation's Logo
Such a move would be rather extreme and unlikely, but it would certainly be welcomed by the Linux community. Officially, Vice President of Linux Platform Software Scott Pritchett stated that the company is "strongly committed" to delivering quality software/hardware experiences and that they hope their membership in the Linux Foundation will "accelerate our collaboration with the organizations and individuals instrumental in shaping the future of Linux." Further, they hope to add to and enhance the user and development experience of the open source operating system.
The three other members joining the Linux Foundation specialize in multimedia software (Fluendo), embedded system development (Lineo Solutions), and device-agnostic security (Mocana), but the green giant that is NVIDIA has certainly stolen the show as the Foundation's big announcement (and understandably so; it is kind of a big deal to have them aboard). Amanda McPherson, VP of Marketing and Developer Services for the Linux Foundation, wrapped up the press release by saying that all of the new members "represent important areas of the Linux ecosystem and their contributions will immediately help advance the operating system.”
NVIDIA has generally enjoyed good support on the major Linux distributions, but now that they are a member, here's hoping they can further improve their Linux graphics card drivers. What is your take on the Linux Foundation's new members? Will they make a difference?
Subject: Editorial, General Tech | March 8, 2012 - 04:00 PM | Ken Addison
Tagged: Z77, ssd, podcast, msi, Intel, gpu, cpu, asus, amd, 7870, 7850
PC Perspective Podcast #192 - 03/08/2012
Join us this week as we talk about the AMD Radeon HD 7870 and 7850, Z77 Motherboard previews, and a Steam console?
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
This Podcast is brought to you by
- 0:00:42 Introduction
- 1-888-38-PCPER or email@example.com
- http://twitter.com/ryanshrout and http://twitter.com/pcper
- 0:02:00 Installing Windows 8 Consumer Preview In A VirtualBox Virtual Machine
- 0:05:00 AMD Radeon HD 7870 2GB and HD 7850 2GB Pitcairn Review
- 0:18:30 ASUS Z77 Chipset Motherboard Preview: Formula, Gene, mini-ITX
- 0:24:30 MSI X79A-GD65 (8D) LGA 2011 ATX Motherboard Review
- 0:26:00 Visual Computing Still Decades from Computational Apex
- 0:36:00 This Podcast is brought to you by
, and their all new Sandy Bridge Motherboards!
- 0:37:00 GDC 12: The bigger big picture, Steam Box to be announced?
- 0:40:30 MSI Shows off Next Generation Twin Frozr IV Cards at CeBIT
- 0:42:30 Peter Pan presents a stylish mouse at CeBIT; Thermaltake's Level 10 M
- 0:45:45 Apple Launching Quad Core Graphics A5X Powered iPad 3 With Retina Display
- 0:49:00 Hardware / Software Pick of the Week
- 1-888-38-PCPER or firstname.lastname@example.org
- http://twitter.com/ryanshrout and http://twitter.com/pcper
Subject: General Tech | March 8, 2012 - 03:09 PM | Jeremy Hellstrom
Tagged: TSMC, 28nm
The news broke yesterday that TSMC had halted 28nm production in mid-February for an undisclosed reason. SemiAccurate thought this strange, as the only company admitting to problems with TSMC's 28nm process is NVIDIA, but at least production was scheduled to restart by the end of March. Today, DigiTimes offers a more positive story about not only the 28nm but also the 20nm process line, as TSMC predicts that demand from their customers could reach as high as 95% of their capacity. Perhaps, with a well stocked channel taking the pressure off, they shut down the line to head off any preventable problems that could keep them from meeting this high demand?
"Taiwan Semiconductor Manufacturing Company (TSMC) chairman and CEO Morris Chang has commented that development of 20nm and 28nm processes is progressing well. The market believes TSMC's capacity utilization rate in second-quarter 2012 should be close to 95%.
Chang indicated that 28nm processes will likely contribute 10% of 2012 revenues.
Industry sources pointed out that TSMC's capacity utilization of its 28nm and 45nm processes at its 12-inch wafer plant has not decreased and delivery usually takes 8-10 weeks.
Equipment makers also indicated that TSMC has been ordering equipment and installations will be completed in first-half 2012.
Industry sources added that other Taiwan-based IC design houses have been experiencing increasing orders. The market was pessimistic before the Lunar New Year holidays causing inventory levels to be low. This has induced the recent wave of increased orders."
Here is some more Tech News from around the web:
- Stop the wedding! WD, Hitachi GST told to wait a day @ The Register
- Lytro camera @ Engadget
- Micron warms up solid hardness, shoves in PCIe hole @ The Register
- Adaptec trickles RAID RoCket fuel into new Xeons @ The Register
- The five technologies that will transform homes of the future @ Ars Technica
- Nvidia joins AMD and Intel in the Linux Foundation @ The Inquirer
- For Windows 8 Users, Stardock Revives the Start Menu @ Slashdot
- Samsung WB700 Review @ HardwareLOOK
- Air Canada’s GoGo In Flight WiFi Tested; A Turbulence Free Experience @ Hardware Canucks
Subject: General Tech, Graphics Cards, Processors, Mobile | March 8, 2012 - 04:02 AM | Scott Michaud
Tagged: ray tracing, tablet, tablets, knight's ferry, Intel
Intel looks to bring ray-tracing from their Many Integrated Core (Intel MIC) architecture to your tablet… by remotely streaming from a server loaded with one or more Knight’s Ferry cards.
Anticipation of ray-tracing has spanned almost the entirety of 3D video gaming history. Solid support for ray-tracing is very seductive for games, as it enables easier access to effects such as global illumination, reflections, and so forth. Ray-tracing is well deserving of its status as a buzzword.
What Knight’s Ferry delivered: ray-traced Wolfenstein with near-linear scaling.
Screenshot from Intel Blogs.
Obviously Intel would love to make headway into the graphics market. In the past Intel has struggled to put forth an acceptable offering for graphics. It is my personal belief that Intel did not take graphics seriously when they were content selling cheap GPUs to be packed in with PCs. While the short term easy money flowed in, the industry slipped far enough ahead of them that they could not just easily pounce back into contention with a single huge R&D check.
Intel obviously cares about graphics now and has been relentless in their research into the field. Their CPUs are far ahead of any competition in terms of serial performance -- and power consumption is getting plenty of attention itself.
Intel long ago acknowledged the importance of massively parallel computing but was never quite able to pit products like Larrabee against anything the companies they once ignored could retaliate with. This brings us back to ray-tracing: what is the ultimate advantage of ray-tracing?
Ray-tracing is a dead simple algorithm.
A ray-trace renderer is programmed very simply and elegantly. Effects are often added directly, without much approximation necessary. No hacking around the numerous caveats within graphics APIs is required to get a functional render on screen. If you can keep throwing enough coal on the fire, it will burn without much effort -- so to speak. Intel just needs to put a fast enough processor behind it, and away they go.
Throughout the article, Daniel Pohl discusses numerous enhancements they have made to their ray-tracing engine to improve performance. One of the most interesting is their approach to antialiasing: if the rays from two neighboring pixels strike different meshes, or strike the same mesh at a point of sharp change (denoted by color) between pixels, then they are flagged for supersampling. Intel will also explore combining that shortcut with MLAA at some point.
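The edge-flagging heuristic described above can be sketched in a few lines. This is an illustrative reconstruction, not Intel's actual engine code; the function name and the color threshold are my own assumptions.

```python
# Sketch of the adaptive-supersampling heuristic: a neighboring pixel pair
# is flagged when its primary rays hit different meshes, or hit the same
# mesh where the shading changes sharply. Threshold is an assumed value.

COLOR_THRESHOLD = 0.25  # assumed tolerance for a "sharp change" in color

def needs_supersampling(hit_a, hit_b):
    """hit_a / hit_b: (mesh_id, rgb_color) results from two neighboring
    primary rays. Returns True if this pixel pair warrants extra rays."""
    mesh_a, color_a = hit_a
    mesh_b, color_b = hit_b
    if mesh_a != mesh_b:
        return True  # silhouette edge between two different meshes
    # Same mesh: flag a sharp shading discontinuity (e.g. a hard crease).
    delta = max(abs(ca - cb) for ca, cb in zip(color_a, color_b))
    return delta > COLOR_THRESHOLD

# Flat wall: same mesh, nearly identical color -> no extra rays needed.
print(needs_supersampling((7, (0.5, 0.5, 0.5)), (7, (0.52, 0.5, 0.5))))  # False
# Object boundary: different meshes -> supersample this edge only.
print(needs_supersampling((7, (0.5, 0.5, 0.5)), (3, (0.1, 0.1, 0.1))))   # True
```

The payoff is that expensive extra rays are spent only along edges, while flat interior regions get a single ray per pixel.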
A little behind-the-scenes trickery...
Screenshot from Intel Blogs.
Intel claims that they were able to achieve 20-30 FPS at 1024x600 resolutions streaming from a server with a single Knight’s Ferry card installed to an Intel Atom-based tablet. They were able to scale to within a couple percent of theoretical 8x performance with 8 Knight’s Ferry cards installed.
I very much dislike trusting my content to online streaming services as I am an art nut. I value the preservation of content which just is not possible if you are only able to access it through some remote third party -- can you guess my stance on DRM? That aside, I understand that Intel and others will regularly find ways to push content to where there just should not be enough computational horsepower to accept it.
Ray-tracing might be Intel’s attempt to circumvent all of the years of research that they ignored with conventional real-time rendering technologies. Either way, gaming engines are moving toward simpler rendering algorithms as GPUs become more generalized and less reliant on fixed-function hardware tied to some arbitrary DirectX or OpenGL specification.
Intel just hopes that they can have a compelling product at that destination whenever the rest of the industry arrives.
Subject: General Tech | March 8, 2012 - 12:38 AM | Tim Verry
Tagged: unreal, udk, samaritan, nvidia, fxaa
Last year we saw Epic Games unveil their Samaritan demo, which showed off next generation gaming graphics using three NVIDIA GTX 580 graphics cards in SLI. Epic showed off realistic hair and cloth physics along with improved lighting, shadows, anti-aliasing, and more bokeh effects than gamers could shake a controller at, and I have to say it was pretty impressive stuff a year ago, and it still is today. What makes this round special is that hardware has advanced such that Samaritan-level graphics can be achieved in real time with a single graphics card, a big leap from last year's required trio of SLI'd NVIDIA GTX 580s!
The Samaritan demo was shown at this year's GDC 2012 (Game Developers Conference) running on a single NVIDIA "Kepler" graphics card in real time, which is pretty exciting. Epic did not share any further details on the upcoming NVIDIA graphics card; however, the knowledge that a single GPU was able to pull off what it took three Fermi cards to do certainly holds promise.
According to GeForce, however, it was not merely the NVIDIA Kepler GPU that made the Samaritan demo on a single GPU possible. The article states that it was the inclusion of NVIDIA's anti-aliasing method known as FXAA, or Fast Approximate Anti-Aliasing, that enabled it. Unlike the popular MSAA option employed by (many of) today's games, FXAA uses much less memory, allowing a single graphics card to avoid being bogged down by memory thrashing. They further state that MSAA is not ideal for the Samaritan demo because the demo uses deferred shading to provide the "complex, realistic lighting effects that would be otherwise impossible using forward rendering," the method employed by many game engines. The downside of pairing MSAA with the demo's arguably better lighting is that it requires four times as much memory: the GPU RAM needs to hold four samples per pixel, and the workload is magnified four times in areas of the game where there are multiple intersecting pieces of geometry.
FXAA vs MSAA
They go on to state that without AA turned on, the lighting in the Samaritan demo uses approximately 120 MB of GPU RAM, while with 4x MSAA turned on it uses about 500 MB. That's 500 MB of memory dedicated just to lighting that could otherwise hold more of the level and physics data, for example, and it forces the GPU to swap more data than it should have to. FXAA, on the other hand, is a shader-based AA method that does not require additional memory, making it "much more performance friendly for deferred renderers such as Samaritan."
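The 120 MB vs 500 MB gap follows directly from the four-samples-per-pixel math. Here is a rough sketch of that arithmetic; the resolution and per-sample byte count are assumptions I picked so the no-AA case lands near the article's figure, not Epic's actual G-buffer layout.

```python
# Rough arithmetic behind the deferred-shading memory cost quoted above.
# Resolution and per-sample size are illustrative assumptions, not
# Epic's actual G-buffer layout for Samaritan.

def gbuffer_mb(width, height, bytes_per_sample, samples):
    """Deferred G-buffer footprint in MB: every sample of every pixel
    stores a full set of lighting attributes (albedo, normal, depth...)."""
    return width * height * bytes_per_sample * samples / (1024 ** 2)

BYTES_PER_SAMPLE = 60  # assumed combined size of all G-buffer attachments
no_aa  = gbuffer_mb(1920, 1080, BYTES_PER_SAMPLE, samples=1)
msaa4x = gbuffer_mb(1920, 1080, BYTES_PER_SAMPLE, samples=4)

print(round(no_aa))   # ~119 MB, near the article's 120 MB no-AA figure
print(round(msaa4x))  # ~475 MB: exactly 4x the storage, in line with ~500 MB
# FXAA runs as a post-process shader pass, so 'samples' stays at 1
# and the G-buffer footprint does not grow at all.
```

The key takeaway matches the article: MSAA multiplies every deferred render target by the sample count, while FXAA leaves the buffers untouched and just filters the final image.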
Without anti-aliasing, the game world would look much more jagged and less realistic. AA seeks to smooth out those jagged edges, and FXAA enabled Epic to run the Samaritan demo on a single next generation NVIDIA graphics card. Pretty impressive if you ask me, and I'm excited to see game developers roll some of the Samaritan graphical effects into their games. Knowing that Epic Games' engine can run on a single graphics card means that future is that much closer. More information is available here, and if you have not already seen it, the Samaritan demo is shown in the video below.
Subject: General Tech, Storage | March 7, 2012 - 01:39 PM | Jeremy Hellstrom
Tagged: western digital, ssd, hitachi, flash, EMC
Hitachi Global Storage Technologies, which was the result of the merger of Hitachi's and IBM's HDD businesses, is likely being purchased by Western Digital tomorrow for about $4.3 billion. This makes sense, as WD has been using Hitachi GST as a sales partner when providing EMC with high end flash disks. The deal comes on the heels of a major sale: the SSD400S flash disk, which uses Intel's 34nm SLC NAND, and the SSD400S-B, which utilizes Intel's new 25nm NAND. Check out the specifications of the flash drives as well as the new SSD company over at The Register.
"WD is buying Hitachi GST and the acquisition is expected to be formally announced tomorrow with a condition of two years of independence for Hitachi GST - imposed by a Chinese anti-competition regulator. EMC has certified Hitachi GST's SSD400S flash disks for use in its VNX mid-range unified storage arrays, including the all-flash VNX5500-F, so WD will effectively fulfil this deal once the acquisition is announced."
Here is some more Tech News from around the web:
- TSMC suddenly halts 28nm production @ SemiAccurate
- AMD cuts loose GlobalFoundries stake @ The Register
- Peter Molyneux has left Microsoft @ The Inquirer
- Workers can't escape Windows 8 Metro - Microsoft COO @ The Inquirer
- 5 Tips and Tricks for Using Yum @ The Register
- Adobe lobs out Flash update to plug 3D security hole @ The Register
- Six foot speaker shakes buildings to their foundation @ Hack a Day
- Digital Innovations Accessories @ TechwareLabs
- Cebit 2012 HardwareHeaven Coverage
Subject: General Tech | March 7, 2012 - 01:12 PM | Jeremy Hellstrom
Tagged: thermaltake, Level 10 M, gaming mouse
Thermaltake General Manager Peter Pan had the following to say,
"At Thermaltake Group we have always been proud of being a pioneer in design, performance and innovation, we are very excited to launch the Level 10 M Gaming Mouse which is our latest collaboration with DesignworksUSA since the legendary Level 10 Chassis."
The specifications above don't even begin to cover the actual features available on this fully adjustable mouse; almost every dimension can be adjusted for the perfect fit to your hand.
Anyone who already owns a keyboard with LED lighting will be delighted to see the way the multicoloured LEDs shine through the grating visible on the top of the mouse, and that is not all the grating can do. Take a look below for reassurance that no matter how long you are gaming or working, your hand is not going to break out in a sweat.
Show off your style at work, LAN parties or just to match the theme of the machine you have at home; this BMW inspired mouse will definitely get you some attention.
Subject: General Tech | March 7, 2012 - 01:30 AM | Tim Verry
Tagged: sales, record, pcga, PC, gaming
In two surprising bits of news, the PC Gaming Alliance is not only still alive and kicking, but its recent report indicates that the PC games industry saw record sales numbers in 2011. The consortium reported that worldwide PC game sales hit $18.6 billion last year, a year-over-year increase of 15%. The initial numbers definitely seem to suggest that PC gaming is nowhere near dead.
The PCGA attributes the rise in sales to growth in the Chinese PC games market and the rising popularity of Free-to-Play games like TF2, Star Trek Online, League of Legends, World of Warcraft (which has a free component), and many others. The integration of some of the most popular Free-to-Play games into Valve's Steam store certainly didn't hurt either!
Further, the release of several big hit titles including Battlefield 3, Skyrim, Saints Row: The Third, Portal 2, and Deus Ex: Human Revolution (my pick for PC Per Game of the Year) all contributed to the record sales numbers on the PC. According to the article, Asian publisher Tencent launched League of Legends and is now bringing in 11 million players (though it's likely that not all of those players are active and/or spend money on the service) and will be surpassing Activision as the company making the most money off of PC games. Zynga, the company behind many of those annoying time sink Facebook games, continued to rake in a boatload of money with 2011 revenue of $1.1 billion.
Not only did the PC gaming alliance report these positive numbers for 2011, but they predict that the PC games industry will continue to grow. As digital distribution systems catch on and broadband connections continue to improve and spread into new areas (though Verizon and AT&T aren't helping matters by stopping further roll outs of FIOS and Uverse), the PCGA predicts that the industry will grow to $25.5 billion, which would be a 37% rise in four years. If the growth rate continues in Asian markets and publishers continue to back down from DRM in favor of producing more titles that people want to buy, their predicted growth is certainly achievable. More information on the PCGA report can be found here along with the full press release here(PDF).