Subject: Processors | January 15, 2015 - 03:41 PM | Jeremy Hellstrom
Tagged: Pentium G3258, overclock, Intel
You just don't see CPU overclocking guides much anymore; the process has become far easier over the years, as Intel and AMD both now sell unlocked CPUs they expect you to overclock, and motherboard tools and UEFI interfaces do much of the heavy lifting for you. No longer are you calculating frequency ratios or drawing on your CPU with conductive ink. Overclockers Club is revisiting those heydays with a guide on how to turn your $70 3.2GHz Pentium G3258 into a more serious beast running well over 4GHz. The overclocking steps are not difficult, but for those without a background in overclocking CPUs, the verification testing steps they describe will be of great value. If you are already well versed in the ways of MemTest86 and Prime95, then perhaps it will be a nice reminder of the days of the Celeron and the huge frequency increases that family rewarded patient overclockers with.
"To reach 4.7GHz was a cinch once I adjusted all the smaller voltage settings. Like all overclockers, it was a journey with many failures along the way. One day it would boot and run Prime95, and the next time Windows would not load. It took a while to sort it out by backing down to 4.5GHz and raising each setting until I settled on the below settings."
Here are some more Processor articles from around the web:
- Pentium J2900 CPU Review @ Hardware Secrets
- Athlon 5150 CPU Review @ Hardware Secrets
- AMD FX-9590 @ Benchmark Reviews
- AMD FX-8320E @ Benchmark Reviews
Subject: General Tech | January 15, 2015 - 02:09 PM | Ken Addison
Tagged: podcast, video, gtx 960, nvidia, maxwell, amd, r9 380x, corsair, carbide, 300R, CES, ces 2015, ECS, Z97-Machine, Intel, crucial
PC Perspective Podcast #332 - 01/15/2015
Join us this week as we discuss GTX 960 and R9 380X Rumors, Corsair Carbide 300R Titanium, and our CES 2015 wrap up
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:11:25
Subject: General Tech | January 15, 2015 - 01:10 PM | Jeremy Hellstrom
Tagged: IBM, mainframe, power8
IBM has just released a new mainframe based on its new 5GHz 22nm Power8-based Processing Units, with some models supporting up to 10TB of RAM; the minimum configuration is a mere 64GB. It can not only run IBM's z/OS but is also capable of directly supporting Linux, and it can be managed with a blade running Windows if you so desire. These fancy-looking little mainframes are set up with drawers of either 39 or 42 PUs so you can upgrade as your usage requires, although two are spares and six are System Assist Processors; the remaining PUs can be assigned to varying roles, as in previous IBM z models. These machines are designed to handle large amounts of data traffic, providing real-time encryption on up to 2.5 billion transactions per day. The Register feels that the most likely usage scenario will be providing secure mobile data traffic, something which is certainly needed. You can also glean more information from this blog entry if you are curious about the architecture and capabilities of this mainframe.
"Of course, the proof of the pudding will be in the market, but IBM will be hoping that the billion dollars it's poured into developing the new z13 mainframe will get the big end of town as excited as Big Blue itself is."
Here is some more Tech News from around the web:
- Samsung dismisses 'groundless' talk of $7.5bn BlackBerry buyout @ The Inquirer
- Acer slips Wang Andresen into senior Euro slot @ The Register
- PAPAGO! GoSafe 272 Dashcam GS272-US @ Benchmark Reviews
- BAPCo TabletMark v3 Benchmark Presentation @ Madshrimps
Subject: General Tech | January 15, 2015 - 03:41 AM | Scott Michaud
Tagged: microsoft, windows, extended support
According to Microsoft's lifecycle calendar, Windows 7 left mainstream support on the 13th of January and has entered “Extended Support”. This means that the operating system will still receive security updates, but not non-security ones, and “requests to change product design and features” will not be accepted. While the OS is over five years old, it is still very popular, especially among PC gamers.
My concern is that this occurred while we anticipate major changes to the Windows platform. While I never really expected that Microsoft would release DirectX 12 for Windows 7, there was still hope that we would see a pre-release or developer build while Windows 7 was still in mainstream support (despite it being several driver models behind). Now that the window has closed, so to speak, that hope is diminishing. Windows 8.1, on the other hand, might be okay, but I have no idea why you would want to stick with it over Windows 10, especially if the latter is a free/cheap update.
Besides DirectX 12, I am also concerned about Microsoft cutting off first-party web browsers at IE11. Sure, it is a much better place to end than IE8 on Windows XP, and the end-user could always install a third-party browser, but it could lead to problems for web developers. It is much easier to say “keep Internet Explorer up to date” (heck, even Microsoft is saying it) than the alternative, “use a different browser”. There are still many features under consideration (Shadow DOM being the most interesting for me) that would be nice to have, without needing to worry about the fraction of a fraction of users stuck on a frozen browser.
But at least it will be kept secure until 2020.
Subject: Storage | January 14, 2015 - 04:38 PM | Jeremy Hellstrom
Tagged: ocz, torture, ARC 100
The Tech Report has already shown that a variety of SSDs can survive long after their rated write endurance has been exceeded, with some drives continuing past 2 petabytes. Kitguru is performing a very similar test, specifically with the OCZ Arc 100 SSD, and has now passed the 100TB mark with the five drives being tested. Not a single one of these consumer-level drives has died, and only one drive has reported an error, and that was one single bad block. While OCZ has had problems with a specific controller in the past, they no longer use that controller, and claims that all of their drives are tarnished are a bit of an exaggeration. A specific performance issue was also finally addressed, but they are certainly not the only manufacturer that has needed to be called out over performance degradation over time. You can see the current results here.
"The drives all passed the warranty figure of 22TB at the close of December – our next test was to get them all past the 100TB mark. Would any fail?"
Here are some more Storage reviews from around the web:
- Silicon Power Slim S80 480GB SSD Review @ NikKTech
- Western Digital My Passport Pro 2 TB Portable (Thunderbolt) @ Tech ARP
- QNAP TS-431 @ Legion Hardware
- Synology DS415play NAS Review @ Madshrimps
Subject: General Tech | January 14, 2015 - 02:24 PM | Jeremy Hellstrom
Tagged: ubisoft, sad, gaming, farcry 4
[H]ard|OCP recently put out a pair of articles covering Far Cry 4. The first covers the game's various new graphics features, many of which are only available to NVIDIA users, while others, like the Godrays, have such a performance impact on AMD GPUs that they may as well be NVIDIA-only. The second will be of more interest to gamers, as they benchmark a dozen GPUs, covering NVIDIA from the GTX 750 Ti through to the GTX 980 and AMD from the R7 260X through to the R9 290X. They also had a chance to test SLI performance, but as Ubisoft decided to disable CrossFire completely in the game, no multi-GPU AMD setups could be tested. Perhaps the most telling conclusion from [H]ard|OCP is also the most obvious: even though this is an evolution of the Far Cry 3 engine, there have been numerous issues with the game since launch, and even after six patches the remaining major issues and the continued refusal to support CrossFire are hurting this game's performance. If you still plan to play the game, you can read [H]'s full performance review to see how your GPU should perform in Ubisoft's latest ... release.
"We play Far Cry 4 on no less than twelve different GPUs for this in-depth look at what graphics settings are playable in Far Cry 4. We will talk about playable settings and show apples-to-apples so you know what to expect in this game and what upgrading your video card may do for you in this new game."
Here is some more Tech News from around the web:
- New Baldur’s Gate Set Between 1 And 2 Coming This Year @ Rock, Paper, SHOTGUN
- Ger ‘Alt Of Here With These Witcher 3 System Specs @ Rock, Paper, SHOTGUN
- Skywind Continues To Move Morrowind Into Skyrim @ Rock, Paper, SHOTGUN
- Alien: Isolation Safe Haven DLC launches on Steam @ HEXUS
- Not So Long: Pillars Of Eternity Release Date Is March 26th @ Rock, Paper, SHOTGUN
- It’s Time: Total War – Warhammer Confirmed @ Rock, Paper, SHOTGUN
Subject: General Tech | January 14, 2015 - 12:32 PM | Jeremy Hellstrom
Tagged: history, cpu, errata, dan luu
A question was asked of Dan Luu: what new tricks has silicon learned since the early days of the eighties? The answer covers the gamut of tools now at the disposal of those who work on low-level code such as drivers and UEFI/BIOS. It is far more than just the fact that we have grown from 8-bit to 64-bit, or the frequencies possible now that were undreamed of before; it delves into newer features such as out-of-order execution and single instruction, multiple data (SIMD) instructions. If you are not familiar with how CPUs and GPGPUs operate at these low levels, it is a great jumping-off point to learn what the features are called and get a rough idea of what tasks they perform. If you know your silicon through and through, it is a nice look back at what has been added in the last 25 years and a reminder of what you had to work without back in the days when flashing a BIOS was a literal thing. You can also check the comments below the links at Slashdot, as they are uncharacteristically on topic.
"An article by Dan Luu answers this question and provides a good overview of various cool tricks modern CPUs can perform. The slightly older presentation Compiler++ by Jim Radigan also gives some insight on how C++ translates to modern instruction sets."
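The SIMD idea Luu covers can be sketched in a few lines. This is a pure-Python toy model, not real vector hardware: a scalar loop retires one element per operation, while a (simulated) 4-wide vector unit covers four lanes per "instruction".

```python
# Toy model of SIMD: a scalar loop handles one element per operation,
# while a simulated 4-wide vector unit handles four lanes at a time.
def scalar_add(a, b):
    # One element per "instruction" -- how a naive loop works.
    return [x + y for x, y in zip(a, b)]

def simd_add(a, b, width=4):
    out = []
    for i in range(0, len(a), width):
        # One "instruction" covering `width` lanes at once.
        out.extend(x + y for x, y in zip(a[i:i + width], b[i:i + width]))
    return out

a = list(range(8))
b = [10] * 8
assert scalar_add(a, b) == simd_add(a, b)
print(simd_add(a, b))
```

Real SIMD (SSE/AVX on x86, NEON on ARM) performs the multi-lane add in a single hardware instruction rather than a Python loop; the toy only shows the shape of the idea.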
Here is some more Tech News from around the web:
- CES 2015: Dell, Lenovo and HP showcase potential of Intel’s 5th-gen Core chips @ The Inquirer
- Insert 'Skeleton Key', unlock Microsoft Active Directory. Simples – hackers @ The Register
- Lego Avengers Assemble to the Helicarrier! @ Hack a Day
- TechwareLabs CES 2015 Event Coverage: Thermaltake
- Toshiba tosses out uber-slim THREE TERABYTE HDD @ The Register
- BlackBerry adopts the iPhone for promotional Twitter campaign @ The Inquirer
- The BenQ W1080ST+ & W1070+ Home Cinema Projector Launch Event @ TechARP
Subject: Graphics Cards | January 14, 2015 - 10:49 AM | Sebastian Peak
Tagged: rumors, nvidia, leak, gtx 960, gpu, geforce
The GPU news and rumor site VideoCardz.com had yet another post about the GTX 960 yesterday, and this time the site claims they have most of the details about this unreleased GPU with new leaked photos from a forum on the Chinese site PCEVA.
The card is reportedly based on Maxwell GM206, a 1024 CUDA core part recently announced with the introduction of the GTX 965M. Clock speed was not listed but alleged screenshots indicate the sample had a 1228 MHz core and 1291 MHz Boost clock. The site is calling this an overclock, but it's still likely that the core would have a faster clock speed than the GTX 970 and 980.
The card will reportedly feature 2GB of GDDR5 memory on a 128-bit bus, though 4GB variants would likely be available from the various vendors after launch (an important option considering the possibility of the new card natively supporting three DisplayPort monitors). Performance will clearly be a step down from the initial GTX 900-series offerings, as NVIDIA has led with its more performant parts, but the 960 should still be a solid choice for 1080p gaming if these screenshots are real.
The specs as listed on the page at VideoCardz.com are as follows (they do not list clock speed):
- 28nm GM206-300 GPU
- 1024 CUDA cores
- 64(?) TMUs
- 32 ROPs
- 1753 MHz memory
- 128-bit memory bus
- 2GB memory size
- 112 GB/s memory bandwidth
- DirectX 11.3/12
- 120W TDP
- 1x 6-pin power connector
- 1x DVI-I, 1x HDMI 2.0, 3x DP
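The leaked memory numbers are at least self-consistent. As a quick back-of-the-envelope check (assuming the usual GDDR5 convention of four data transfers per memory clock):

```python
# Checking the leaked GTX 960 memory figures against each other.
memory_clock_mhz = 1753
transfers_per_clock = 4        # GDDR5 moves data four times per clock
bus_width_bytes = 128 // 8     # 128-bit bus = 16 bytes per transfer

effective_mts = memory_clock_mhz * transfers_per_clock          # 7012 MT/s
bandwidth_gbs = effective_mts * 1e6 * bus_width_bytes / 1e9     # GB/s
print(round(bandwidth_gbs, 1))  # ~112.2, matching the quoted 112 GB/s
```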
We await official word on pricing and availability for this unreleased GPU.
Subject: General Tech | January 14, 2015 - 04:20 AM | Scott Michaud
Tagged: GTA5, GTA Online, consolitis
The fifth major release of Grand Theft Auto was launched sixteen months ago on the Xbox 360 and PlayStation 3, without the PC. Eventually, Rockstar announced next-gen versions for the Xbox One and PlayStation 4, which explained why the PC was missing before: it was considered a next-gen platform. Those platforms launched a year after the initial release, but the PC was pushed into January 2015. Now it has been pushed again, into late March (the 24th to be precise).
It's not like we're not trying!
Again, I hope that the extra time will be worth it -- and it might be, too. The game was overwhelmingly successful from a sales standpoint, but Grand Theft Auto Online (its multiplayer component) was criticized for a wide range of issues: service connectivity, glitches including loss of characters and progression, and some even claim a lack of content. Maybe, just maybe, it will be polished by the time it gets to us. And hey, Rockstar even claims that it will launch with Heists (which could be considered a running joke in itself).
They also claim that Grand Theft Auto Online for the PC will support 30 players. Nice.
The system specifications were also released, and they're fairly modest (unlike other recent titles). At a minimum, you will need a 64-bit OS with 4GB of memory and 65GB of drive space, which might be a stumbling block for some. Besides that? Core 2 Quad Q6600 and a GeForce 9800 GT. Its recommended specs push the CPU up to an Ivy Bridge Core i5, 8GB of RAM, and a GeForce GTX 660.
It is interesting to see that only quad-core (or higher) CPUs are supported, but fairly old ones. Unless something like Far Cry 4 happens, a dual-core Pentium Anniversary Edition should have plenty of performance to keep up. Hopefully Rockstar doesn't error-out machines that do not report at least four threads.
Subject: Mobile | January 13, 2015 - 05:49 PM | Jeremy Hellstrom
Tagged: xiaomi, redmi note, android 4.4
We hear a lot about Xiaomi products here in North America, but as they are not commonly found for sale by the major providers, we do not tend to see them in action. Madshrimps recently reviewed the 5.5" Redmi Note, one of Xiaomi's lower priced offerings. It pairs a 720p screen with a 1.6GHz Snapdragon 400-series CPU and Adreno 305 GPU, along with 2GB of low-power DDR3 and 8GB of onboard storage. The base OS is a modified version of Android 4.4, and it comes with a wide variety of apps installed, including many now-popular fitness apps. If you are curious how a Xiaomi phone priced under $200 without contract performs, and just what it comes with, check out the review here.
"The Redmi Note 4G smartphone has impressed us positively because it offers a lot for its value; the incorporated Snapdragon 400 MSM8928 SoC from Qualcomm may not have a very powerful 3D component but it compensates with good CPU performance, quite decent power consumption and thanks to that we did not have the feeling that the smartphone discharges pretty fast right in front of our eyes as we have seen with some MTK6592 units which we have reviewed in the past."
Here are some more Mobile articles from around the web:
- Kolina K100 + V6 Smartphone Review @ Madshrimps
- UleFone Be One Smartphone Review @ Madshrimps
- LG G Watch R Smartwatch Review @ TechwareLabs
- Asus Chromebook C300 @ Kitguru
- Cyberpower Fangbook Edge 4K Laptop @ Kitguru
Subject: General Tech | January 13, 2015 - 03:52 PM | Sebastian Peak
Tagged: set-top box, remote access, pc game streaming, nzxt, DOKO
The new DOKO device from NZXT is an interesting spin on the living room streaming box, and it's a lot more than another Netflix player.
So what exactly is it? According to NZXT "DOKO is a low latency (50-80ms), 1080p 30 FPS PC streaming device that brings you the full functionality of your PC, anywhere in your home."
The DOKO provides the interface to remotely connect to computers over your network, providing access to whatever resources you have on your PC. The DOKO has USB ports to connect peripherals, and though there is no proprietary hardware required, the company has compiled a “recommended” list of compatible keyboards, mice, and game controllers on their site.
The DOKO interface
And NZXT is making the gaming aspect of the streamer’s capability a big part of the product, though with a 30 FPS limit it isn't as exciting as it could be.
“DOKO brings you unrestricted, latency-free gaming direct to your TV. Experience a new way to play your favorite PC games, with complete access to ALL of them, whether they are from Steam, Origin, Uplay or any other source.”
In-home streaming is already a part of Steam, but the idea of an agnostic gaming experience without a second computer is attractive if it works as well as advertised. The company also points out the advantage of being able to do everything your PC can do… (Uh, we’re talking about spreadsheets, right?)
The DOKO will be available exclusively from NZXT’s online store (sorry, online "Armory") for $99, and will start shipping January 28.
Subject: Graphics Cards | January 13, 2015 - 02:28 PM | Sebastian Peak
Tagged: rumors, nvidia, multi monitor, mini-ITX GPU, leak, HDMI 2.0, gtx 960, gpu, geforce, DisplayPort
The crew at VideoCardz.com have been reporting some GTX 960 sightings lately, and today they've added no less than three new cards from KFA2, the "European premium brand" of Galaxy.
The reported reference design GTX 960 (VideoCardz.com)
Such reports are becoming more common, with the site posting photos that appear to be other vendors' versions of the new GPU here, here, and here. Of note in these new alleged photos of what appears to be a reference design board: no less than three DisplayPort outputs, as well as HDMI 2.0 and DVI:
Reported GTX 960 outputs (VideoCardz.com)
This would be big news for multi-monitor users, as it would provide potential support for three high-resolution DisplayPort monitors from a single card in a strictly non-gaming environment (unless you happen to enjoy the frame rates of an oil painting).
The reported mini-ITX GTX 960 (VideoCardz.com)
The other designs shown in the post include a mini-ITX form-factor design still sporting the triple DisplayPorts, HDMI and DVI, and a larger EXOC edition built on a custom PCB.
Reported EXOC GTX 960 (VideoCardz.com)
The EXOC edition apparently drops the multi-DisplayPort option in favor of a second DVI output, leaving just one DisplayPort along with the lone HDMI 2.0 output.
With the GTX 960 leaks coming in daily now, it seems likely that we will be hearing something official soon.
Subject: Memory | January 13, 2015 - 01:31 PM | Jeremy Hellstrom
Tagged: ddr4, ddr4-2800, corsair, Corsair Vengeance LPX, X99
With the release of the X99 chipset came the introduction of DDR4, which is not seeing the same uptake that DDR3 did at launch, though it is still selling well. Part of this may be the pricing: DDR3 was also expensive when it first launched, but even stalwart early adopters may balk at the $340 asking price for the Corsair Vengeance LPX 16GB 2800MHz kit. The other main reason for the mild reception is the minimal performance gain DDR4 offers; you can see a slight difference in synthetic benchmarks, but when it comes to gameplay the increase is minuscule for the price you pay. If you do have an X99 board, then this kit is a good choice: not only can you often find similar kits on sale for significantly less than $300, Overclockers Club also overclocked these DIMMs to 3200MHz at timings of 16-16-16-30. Check out their review here.
"Packed full of promise, the latest modules in the Vengeance series of Corsair's DDR4 memory lineup deliver excellent performance when tweaked to get the tightest timings. Out of the box they come with 16-18-18-36 primary timings using just 1.2v to run the modules. By tweaking the applied voltage a little bit you can get the timings much tighter at the rated speed and even when running at my max overclock of 3200MHz. At this speed I was able to run the timings at 15-15-15-28 2T using over 1.4v applied to the modules."
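For context on what a speed grade like DDR4-2800 means in raw throughput, here is a rough back-of-the-envelope calculation (assuming all four of X99's 64-bit memory channels are populated, which this 16GB quad-channel kit is built for):

```python
# Peak theoretical bandwidth for DDR4-2800 on a quad-channel X99 platform.
transfer_rate_mts = 2800       # DDR4-2800 = 2800 mega-transfers/s per channel
channel_width_bytes = 64 // 8  # each DDR channel is 64 bits wide
channels = 4                   # X99 runs quad-channel memory

per_channel_gbs = transfer_rate_mts * 1e6 * channel_width_bytes / 1e9
total_gbs = per_channel_gbs * channels
print(per_channel_gbs, total_gbs)  # 22.4 GB/s per channel, 89.6 GB/s total
```

Real-world throughput lands well below this peak figure, which is part of why the gaming gains over fast DDR3 are so small.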
Here are some more Memory articles from around the web:
- Crucial DDR4-2133 32GB Memory Kit Review @ Hardware Canucks
- Corsair Vengeance LPX DDR4 2666MHZ 16GB Quad Channel Memory Kit @ Bjorn3D
- Corsair DDR4 16GB Vengeance LPX 2800C16 Memory Kit Review @ Madshrimps
- Avexir Core Series 1600MHz CL9 memories with orange LEDs @ HardwareOverclock
Subject: General Tech | January 13, 2015 - 12:59 PM | Jeremy Hellstrom
Tagged: win7, extended support, the cycle of life, inevitable
Sigh, the end draws nigh: that most common of desktop operating systems, Windows 7, has moved into Extended Support. This follows its move at Halloween from an active product to one no longer available, but it is not the final straw for the OS, whose end of life is currently scheduled for 2020. The Inquirer quotes a source which places Win7's current market share at just over 56% globally, far above the currently selling Win8.1, but that number will slowly begin to fall, likely at a quicker pace than WinXP's share did. When a Windows product reaches Extended Support it still receives security patches and serious bug fixes, albeit at a slower pace than when it is current, so don't worry that your Win7 boxen will be dying any time soon; it does, however, make it even more worthwhile to familiarize yourself with Windows 10, as new machines will be running that OS very soon. Drop by The Inquirer for other upcoming dates, such as the final nail in Vista's coffin.
"WINDOWS 7 has reached an important milestone that begins its long, slow descent into obscurity and eventually end of life, where it will doubtless continue to command more market share than its successor."
Here is some more Tech News from around the web:
- The 4 Best New Linux Distributions to Watch in 2015 @ Linux.com
- How To Hijack Your Own Windows System With Bundled Downloads @ Slashdot
- IBM: Hey, Intel and pals. Look on our massive patent pile and despair @ The Register
- Quantum dot materials still an issue, say observers @ DigiTimes
- Making better Li-ion battery membranes @ Nanotechweb
Subject: Graphics Cards | January 13, 2015 - 12:22 PM | Ryan Shrout
Tagged: rumor, radeon, r9 380x, 380x
Spotted over at TechReport.com this morning and sourced from a post at 3dcenter.org, it appears that some additional information about the future Radeon R9 380X is starting to leak out through AMD employee LinkedIn pages.
Ilana Shternshain is an ASIC physical design engineer at AMD with more than 18 years of experience, 7-8 of those years with AMD. Under the background section is the line "Backend engineer and team leader at Intel and AMD, responsible for taping out state of the art products like Intel Pentium Processor with MMX technology and AMD R9 290X and 380X GPUs." A bit further down is an experience listing of the PlayStation 4 APU as well as "AMD R9 380X GPUs (largest in “King of the hill” line of products)."
Interesting - though not entirely enlightening. More interesting were the details found on Linglan Zhang's LinkedIn page (since removed):
Developed the world’s first 300W 2.5D discrete GPU SOC using stacked die High Bandwidth Memory and silicon interposer.
Now we have something to work with! A 300 watt TDP would make the R9 380X more power hungry than the current R9 290X Hawaii GPU. High bandwidth memory likely implies memory located on the substrate of the GPU itself, similar to what exists on the Xbox One APU, though configurations could differ in considerable ways. A bit of research on the silicon interposer reveals it as an implementation method for 2.5D chips:
There are two classes of true 3D chips which are being developed today. The first is known as 2½D where a so-called silicon interposer is created. The interposer does not contain any active transistors, only interconnect (and perhaps decoupling capacitors), thus avoiding the issue of threshold shift mentioned above. The chips are attached to the interposer by flipping them so that the active chips do not require any TSVs to be created. True 3D chips have TSVs going through active chips and, in the future, have potential to be stacked several die high (first for low-power memories where the heat and power distribution issues are less critical).
An interposer would allow the GPU and stacked-die memory to be built on different process technologies, for example, but could also make the chips more fragile during final assembly. Obviously there are a lot more questions than answers based on these rumors sourced from LinkedIn, but it's interesting to attempt to gauge where AMD is headed in its continued quest to take back market share from NVIDIA.
Subject: Cases and Cooling | January 12, 2015 - 02:21 PM | Jeremy Hellstrom
Tagged: swiftech, H220-X, AIO, watercooler
Swiftech has taken a new generation of its MCR radiators and paired it with the tried and tested Apogee XL waterblock in its new AIO watercooler, the H220-X. At ~$170 it is more expensive than many competitors' solutions, and so will need to perform at a higher level to earn a recommendation from [H]ard|OCP. The cooler does offer some extras which the competition does not, which helps justify the pricing: you can power up to eight fans with the included adapter, which makes sense as the modular design of the H220-X allows you to add to the cooling loop if you so desire. The performance was quite good, especially when you consider how quietly the cooler operates at full load, but as [H] mentions in their conclusion, the price is quite high and they would have liked to see the MSRP at a much lower $130.
"Swiftech is a standard name in the computer hardware enthusiast arena. Today we review its answer to an enthusiast All-In-One CPU cooler. As you might guess it is strong on hardware, design, and purpose. The H220-X CPU Liquid Cooling Kit focuses on little to no noise while providing excellent cooling."
Here are some more Cases & Cooling reviews from around the web:
- Fractal Design Kelvin S24 @ techPowerUp
- Noctua NF-A Fan Series Features Roundup @ Neoseeker
- Reeven Six-Eyes II Fan Controller Review @ OCC
- New 92mm-fan Tower Coolers from Noctua @ Silent PC Review
- SilentiumPC Grandis XE1236 @ techPowerUp
- Noctua NH-U9S CPU Cooler Review: Undersized but Over-performs @ Modders-Inc
- Corsair Graphite 780T Full Tower Case Review @ Neoseeker
- Thermaltake Core V31 Midi Tower Review @ NikKTech
- Lian Li PC-O5S @ Legion Hardware
- Cooler Master Silencio 652S @ Benchmark Reviews
- Phanteks Enthoo Evolv @ techPowerUp
- Thermaltake Core V21 MATX Case @ Modders-Inc
Subject: General Tech | January 12, 2015 - 01:29 PM | Jeremy Hellstrom
Tagged: ultrasound, opencl, hd 7850
The new bk3000 ultrasound system from Analogic will use an embedded HD 7850 and OpenCL to triple the quality of the information the ultrasound reveals. This will allow ultrasounds to reveal anatomical detail and micro-vascularization that was not available with previous ultrasound technology, and could even enable Gamergaters to locate their own heads with the use of the E14C4t transducer. The most familiar usage of ultrasound is displaying a fetus in utero, but there are far more medical uses for this type of (mostly) non-invasive scan, and the increase in detail and the transformation abilities that OpenCL brings will not only make it more effective but could expand the usefulness of ultrasound as a diagnostic tool. As we at PC Perspective continue to age we are very appreciative of advances such as this, especially if we can get a split screen that allows us to do a little light gaming while the doctors poke and prod!
SUNNYVALE, Calif. — Jan. 12, 2015 — AMD (NASDAQ:AMD) today announced that the AMD Embedded Radeon HD 7850 GPU is enabling cutting-edge application performance for the BK Ultrasound, powered by Analogic, bk3000 ultrasound system. Analogic is a leader in developing healthcare and security technology solutions to advance the practice of medicine to save lives.
“The AMD Embedded Radeon HD 7850 GPU with OpenCL provides a powerful and efficient pairing,” said Cameron Swen, segment marketing manager, medical applications, AMD Embedded Solutions. “This product is yet another proof point to AMD’s dedication to the healthcare segment through its technology, which helps facilitate crisp, detailed medical image visualization and other advanced graphics-driven capabilities, helping doctors provide improved care for patients.”
Analogic used the OpenCL standard to gain access to the GPU for general-purpose computing, referred to as “GPGPU,” delivering exceptional performance and offering system and development cost reduction through cross-platform portability. As a result of using AMD GPU technology, Analogic achieved a 3x improvement in the amount of information in each ultrasound image and reduced time from capture to presentation. Traditional FPGAs and DSPs create a fixed, inflexible implementation that requires custom software targeted at specific hardware. Going to a software-based solution using OpenCL helps to further lower the development cost and provides improved long term value since the software can be used across product lines and through generation shifts.
“It was a critical design goal for us to implement a platform that delivered exceptional performance,” said Jacques Coumans, chief marketing and scientific officer, Analogic. “After reviewing the options available, we chose the AMD Embedded Radeon HD 7850 GPU for its excellent quality and scalability. The bk3000 ultrasound system, powered by AMD embedded graphics technology, delivers exceptional speed and image fidelity, which allows clinicians to identify anatomy and flow dynamics deeper in challenging patients.”
The AMD Embedded Radeon HD 7850 is based on AMD’s award-winning Graphics Core Next (GCN) architecture to advance the visual growth and parallel processing capabilities of embedded applications. In addition to ultrasound, other applications for GPGPU include some of the most complex parallel applications such as terrain and weather mapping, facial and gesture recognition, and biometric and DNA analysis.
The new Analogic bk3000 ultrasound system is targeted for urology, surgery, general imaging, and procedure guidance applications and is commercially available in key markets worldwide.
Subject: General Tech | January 12, 2015 - 12:42 PM | Jeremy Hellstrom
Tagged: freesync, amd
The Tech Report has published a video of FreeSync in action, shot with a camera that records at 240 FPS. The subject material is running at less than the 60Hz most of our monitors use, which means you can actually see what the technology does for you. With a fixed 60Hz refresh you can see tearing on the blades of the windmill, as the actual frame rate of the render is 44-45 FPS, while with FreeSync active the matched frequencies cause no tearing. The demonstration shows how FreeSync can benefit lower-end systems that are not going to push a 144Hz monitor to its limits; if you can only manage 40-50 FPS in a game, FreeSync is going to make it much easier on your eyes. You can catch our latest coverage of FreeSync here.
"We've been hearing about FreeSync, AMD's answer to Nvidia's G-Sync variable refresh display tech, for just over a year now. This week at CES, we finally got a chance to see FreeSync in action, and we used that opportunity to shoot some enlightening 240-FPS footage. We were able to find out some new specifics from AMD, as well."
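The refresh-rate mismatch described above is simple arithmetic; this sketch (purely illustrative, not from TR's article) shows why a ~45 fps render can never line up with a fixed 60Hz scan-out:

```python
# Why a ~45 fps render tears on a fixed 60Hz display: each rendered
# frame takes longer than one scan-out, so new frames arrive partway
# through a refresh and the panel shows pieces of two frames at once.
refresh_hz = 60.0
render_fps = 45.0

refresh_interval_ms = 1000.0 / refresh_hz  # ~16.67 ms per scan-out
frame_time_ms = 1000.0 / render_fps        # ~22.22 ms per rendered frame

print(f"scan-out interval: {refresh_interval_ms:.2f} ms")
print(f"frame time:        {frame_time_ms:.2f} ms")

# With FreeSync the display instead waits for each completed frame,
# refreshing every ~22.22 ms so frames and scan-outs always match.
```

With v-sync off that mismatch shows up as tearing; with v-sync on it shows up as stutter instead, which is exactly the trade-off variable refresh removes.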
Here is some more Tech News from around the web:
- TR's big CES 2015 digest
- Techgage’s Best Of CES 2015
- Kingston uses Marvell controllers for PCIe SSD @ DigiTimes
- CES 2015: Sony Smartwatch 3 Steel hands-on @ The Inquirer
- Nokia N1 price, release date and specs @ The Inquirer
- Got a 4King big TV? Ready to stream lots of awesome video? Yeah, about that… @ The Register
- ASUS router-popping exploit on the loose @ The Register
Subject: Cases and Cooling | January 12, 2015 - 11:22 AM | Sebastian Peak
Tagged: phanteks, mini-itx, micro-atx, Enthoo Mini XL, enclosure, dual-motherboard, cases
Phanteks has introduced a computer enclosure with a new form factor it is calling “super micro ATX”, a larger alternative to standard mATX designs with the advantage of supporting two complete systems within a single case.
The second motherboard is supported via their ITX upgrade kit and, as the name indicates, the second system must be built on the mini-ITX platform. While this might appeal only to a small market, some users do need to run discrete systems, and this design is certainly an interesting alternative to running two boxes. How it handles heat dissipation is a good question, but considering the “extreme cooling” capacity of the case - with up to 14x 120mm or 8x 140mm fan mounts - there would be plenty of room for a pair of AIO solutions to keep the CPU heat outside the enclosure.
The mini-ITX board is installed at the top (Image credit: cowcotland.com)
The enclosure’s dimensions are (WxHxD) 260mm x 550mm x 480mm (10.24” x 21.65” x 18.90”), and the feature list includes:
- Dual removable hard drive cages
- 2x removable Drop-N-Lock SSD brackets
- Fully equipped with dust filters (1x top, 1x front, 2x bottom)
- Removable top panel for easy fan installation and dust filter cleaning
- Compartment for fan installation in top panel
- Clean cable management using Phanteks' preinstalled Hook-N-Loop cable ties
- Mod-friendly structure uses screws, NOT rivets
- 10-color ambient lighting controller
- 2x USB 3.0, microphone, 3.5mm audio jack
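As a quick sanity check on the metric-to-imperial dimensions quoted above (purely illustrative, not from Phanteks):

```python
# Convert the Enthoo Mini XL's dimensions from millimeters to inches
# and confirm they match the quoted 10.24" x 21.65" x 18.90".
MM_PER_INCH = 25.4
dims_mm = {"width": 260, "height": 550, "depth": 480}

for name, mm in dims_mm.items():
    print(f'{name}: {mm} mm = {mm / MM_PER_INCH:.2f}"')
```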
Two backplates! (Image credit: cowcotland.com)
For full specs see the product page at the Phanteks site. Pricing is not listed and searching for the product at the usual places doesn’t turn up any listings as of this morning.
Subject: General Tech | January 11, 2015 - 03:08 PM | Sebastian Peak
Tagged: wearables, SoC, smartwatch, Intel, ces 2015, CES, arm
Wearable tech shown at this year's CES by Intel included the Intel MICA and Basis PEAK, but a blog post from ARM reports that a pair of these devices are powered by ARM SoCs.
The Intel MICA (Image credit: Intel)
ARM has posted teardown pictures of various wearable products, highlighting its presence in these new devices. The pictures we have taken from ARM's blog post show that it is not Intel at the heart of the two particular models listed below.
First is the Basis PEAK, and it actually makes a lot of sense that this product would have an ARM SoC, considering Intel's acquisition of Basis occurred late in 2014, likely after development of the PEAK had been completed.
The Basis PEAK (Image credits: Basis, ARM)
Of course, it is likely that Intel plans to integrate its own mobile chips into future versions of wearable products like the PEAK.
Of some interest, however, is the SoC within Intel's own MICA luxury wearable.
The Intel MICA (Image credits: Intel, ARM)
For now, ARM is the industry standard for mobile devices, and the company is quick to point this out in its blog post, writing "it’s important to remember that only ARM and its partners can meet the diversity requirements and fuel innovation in this space". Intel seems to be playing the "partner" role for now, though not exclusively, as the company's mobile technology is powering the newest ASUS ZenFone, for instance.