Subject: General Tech | November 4, 2013 - 01:28 PM | Jeremy Hellstrom
Tagged: Intel, linux, open source, Broadwell
Over the weekend, 62 patches to the Linux kernel were released, enabling Broadwell GPU support well ahead of the processor's scheduled release date. Not only is this great news for open source enthusiasts who appreciate it when large companies like Intel release detailed driver code, but it also means that Broadwell should function well with Linux on launch day. Phoronix also reports that more code is scheduled to arrive this week to enable features unique to Broadwell, so keep your eyes peeled for any specifications we can infer from the code as it becomes available.
"While Intel's Broadwell processors won't be launching until 2014 as the successor to Haswell, this weekend the initial open-source Linux GPU kernel driver was published ahead of the Linux 3.13 kernel merge window. The changes are massive and it's looking like the Broadwell graphics improvements will be astonishing and provide significant improvements over Haswell and earlier generations of Intel graphics."
Here is some more Tech News from around the web:
- Netflix starts 4K TV trial ahead of 2014 rollout @ The Inquirer
- Linux 3.12 Released, Linus Proposes Bug Fix-Only 4.0 @ Slashdot
- Google announces another partially fixed security flaw @ The Inquirer
- Google teaches Chrome Canary to sing when it sniffs dodgy downloads @ The Register
- Ding-dong! Who's at the door now with a big wad of cash, BlackBerry? @ The Register
- TRENDnet AC1750 Dual Band Wireless Router @ NikKTech
Subject: General Tech | October 8, 2013 - 07:59 PM | Jeremy Hellstrom
Tagged: amd, dirty pool, linux, open source
Rebranding existing hardware with fancy new model numbers is nothing new to either AMD or NVIDIA. Review sites catch on immediately, and while we like to see optimizations applied to mature chips, we much prefer brand new hardware. When you release the Q-1200000 as the X-2200000, the best you will get from review sites is a recommendation to go with the model that has the lower price, not the higher model number. Most enthusiasts have caught on to the fact that they are the same product; we do not like it, but we have come to accept it as common business practice. Certain unintentional consequences of a design we can forgive, as long as you admit the issue and work to rectify it; only the intentional limitations are being discussed in this post.
This is where the problem comes in, as it seems that this intentional misleading of customers has created a mindset in which it is believed to be OK to intentionally impose performance limitations on products. Somehow companies have convinced themselves that a customer base which routinely tears apart hardware, uses scanners to see inside actual components, and writes its own OSes from scratch (or at least updates the kernel) will somehow not discover these limitations. Thus we have yesterday's revelation that NVIDIA has artificially limited the number of screens usable in Linux to three; not because of performance or stability issues but simply because it might provide Linux users with a better experience than Windows users.
Apparently AMD is not to be outdone when it comes to this kind of dirty pool; in their case it is audio that is limited as opposed to video. If you are so uncouth as to use a DVI to HDMI adapter which did not come with your shiny new Radeon, then you are not allowed to have audio transferred over that HDMI cable on either Windows or Linux. There is a, shall we say, Apple-like hardware check, which Phoronix reported on, that will disable the audio output unless a specific EEPROM on your adapter is detected. NVIDIA doesn't sell monitors, nor is AMD really in the dongle business, but apparently they are willing to police the components you choose to use, though the causes of AMD's decision are not as clear as NVIDIA's since, as far as we know, Monster Cable does not have the magic EEPROM in its adapters.
If your customers are as talented as your engineers, you might not want to listen to the salespeople who tell you that partnerships with other companies matter more than the customers you antagonize by trying to pull a fast one on them. We will find out, and it will come back to haunt you. Unless, of course, the payoffs from your partnerships are worth more than what you make selling to customers, in which case you might as well just ignore us.
"For some AMD Radeon graphics cards when using the Catalyst driver, the HDMI audio support isn't enabled unless using the simple DVI to HDMI adapter included with the graphics card itself... If you use another DVI-to-HDMI adapter, it won't work with Catalyst. AMD intentionally implemented checks within their closed-source driver to prevent other adapters from being used, even though they will work just fine."
Here is some more Tech News from around the web:
- The TR Podcast 143: Steamy news from Hawaii
- 4GB DDR3 contract prices to rise by double digit percentage in October @ DigiTimes
- Microsoft unveils enterprise cloud solutions @ DigiTimes
- AMD's SeaMicro: 'We're the mystery vendor behind Verizon's cloud' @ The Register
- Intel HD Sandy/Ivy Bridge Linux Graphics Performance On Ubuntu 13.10 @ Phoronix
- Club3D Radeon R9 280X royalKing 3GB Graphics Card Giveaway @ eTeknix
Subject: General Tech | October 7, 2013 - 01:46 PM | Jeremy Hellstrom
Tagged: nvidia, linux, microsoft, open source
If you haven't heard the accusations flying over the possible scenarios that led up to Origin PC dropping AMD cards from all of its machines, you can catch up at The Tech Report. They keep any speculation to a minimum, unlike other sites, but the key point is the claims of overheating and stability issues, something that apparently only Origin has encountered. If they had stuck to mentioning the frame pacing in CrossFire and the 4K/multimonitor issue, it would be understandable that they not sell AMD cards in systems designed for that usage, but dropping them altogether is enough to start rumours and conspiracy theories across the interwebs. Winning a place in the Steam Machine was great for NVIDIA, but at no time did they imply that AMD was unworthy; they merely didn't win the contract.
Today some oil was tossed on the fire with the revelation that NVIDIA is specifically limiting the functionality of its hardware on Linux. Just after we praised their release of documentation for Nouveau, the open source driver, we find out from a post at The Inquirer that NVIDIA limits the number of monitors usable in Linux to three so as not to outdo its own functionality in Windows. For a brief moment it seemed that NVIDIA was willing to cooperate with the open source and Linux communities, but apparently that moment is all we will get, and once again NVIDIA proves that it is willing to bow to pressure from Microsoft.
"According to a forum poster at the Nvidia Developer Zone, the v310 version of the drivers for Basemosaic has reduced the number of monitors a user can connect simultaneously to three."
Here is some more Tech News from around the web:
- Cisco, Google and SAP may buy BlackBerry's bits: report @ The Register
- Toshiba unveils 'lightest and thinnest' workstation and a raft of business ultrabooks @ The Inquirer
- Microsoft claims its Surface 2 tablets are 'selling out' without spilling figures @ The Inquirer
- Down with Unicode! Why 16 bits per character is a right pain in the ASCII @ The Register
- NSA using Firefox flaw to snoop on Tor users @ The Register
- LED Costumes and Clothing @ Hack a Day
- Witnessing The League of Legends Season 3 World Championship Finals @ Legit Reviews
- OZONE Gaming Worldwide Joint Giveaway @ NikKTech
Subject: General Tech, Systems | August 3, 2013 - 04:13 AM | Tim Verry
Tagged: SFF, open source hardware, open source, minnowboard, Intel, embedded system, atom
The Intel Open Source Technology Group along with CircuitCo recently launched a new small form factor bare-bones system based on open source hardware and running open source software. The Minnowboard includes a 4.2” x 4.2” motherboard, passively-cooled processor, rich IO, UEFI BIOS, and the Angstrom Linux operating system.
The Minnowboard is powered by a single-core Intel Atom E640 processor clocked at 1GHz. It is a 32-bit CPU with Hyper-Threading and VT-x virtualization support. Other hardware includes an integrated Intel GMA 600 GPU, 1GB of DDR2 memory, and 4MB of flash memory used for the motherboard firmware. Storage can be added by plugging an SSD or HDD into the single SATA II 3Gbps port.
The Minnowboard has the following IO options:
- 1 x micro SD
- 1 x SATA II 3Gbps
- 2 x USB 2.0 ports
- 1 x micro USB
- 1 x mini USB (serial connection)
- 1 x RJ45 jack (Gigabit Ethernet)
- 2 x 3.5mm audio jacks (line in and line out)
- 1 x HDMI
The Minnowboard also has a GPIO header with 8 buffered GPIO pins, 2 GPIO LEDs, and 4 GPIO switches. As such, the system can be expanded by adding extra open source modules called “Lures.” The board is aimed at developers and embedded system manufacturers. The Minnowboard can be used as the bare system or can be integrated into a case or larger device.
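Since the board ships with Angstrom Linux, those buffered GPIO pins can be driven from userspace through the kernel's legacy sysfs GPIO interface. Below is a minimal hedged sketch; the pin number and the helper class are illustrative assumptions, not taken from Minnowboard documentation, and the `base` parameter exists only so the helper can be exercised off-target.

```python
from pathlib import Path

class SysfsGPIO:
    """Minimal wrapper around the legacy /sys/class/gpio interface."""

    def __init__(self, pin, base="/sys/class/gpio"):
        self.pin = pin
        self.base = Path(base)
        self.pin_dir = self.base / f"gpio{pin}"

    def export(self):
        # Asking the kernel to export the pin creates the gpioN directory.
        if not self.pin_dir.exists():
            (self.base / "export").write_text(str(self.pin))

    def set_output(self, value):
        # Configure the pin as an output, then drive it high or low.
        (self.pin_dir / "direction").write_text("out")
        (self.pin_dir / "value").write_text("1" if value else "0")
```

On the actual board you would substitute whatever sysfs number the Minnowboard documentation assigns to each header pin, and run as a user with permission to write under /sys/class/gpio.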
The Minnowboard costs $199 and is available for purchase now from Digi-Key, Farnell (UK), Mouser, and Newark.
Obviously, the Minnowboard is nowhere near as cheap as the $35 Raspberry Pi, but it is running x86 hardware which may make it worth it to some users.
If you are interested, you can learn more about the hardware and get involved with the Minnowboard project over at Minnowboard.org.
Subject: Motherboards | May 15, 2013 - 03:56 AM | Tim Verry
Tagged: server, open source hardware, open source, open compute project, open 3.0, amd
Throughout last year, AMD worked with the Open Compute Foundation to develop open source hardware for servers. The goal of the project was to bring lower-cost, efficient motherboards (compatible with AMD processors) to the server market. Even better, the AMD-compatible hardware is open source, which gives companies and OEM/system integrators free rein to modify and build the hardware themselves. The latest iteration of the project is called Open 3.0, and motherboards based on the design(s) are available now from a number of AMD partners.
An AMD Open 3.0 motherboard.
According to a recent AMD press release, Open 3.0 motherboards will be available from Avnet, Hyve, Penguin Computing, and ZT Systems beginning this week. The new motherboards strip out unnecessary and "over-provisioned" hardware to cut down on upfront hardware costs and electrical usage. Open 3.0 uses a base open source motherboard design that can then be further customized to work with a variety of workloads and in various rack/server configurations. Servers based on Open 3.0 will range from 1U to 3U in size and can slot into standard 19" racks or Open Rack environments. The boards, with their dual Opteron 6300-series processors, will reportedly be suitable for High Performance Computing (HPC), Virtual Desktop Infrastructure (VDI), cloud applications, and storage servers. AMD claims that its Open 3.0 motherboards can reduce the Total Cost of Ownership (TCO) of servers by up to 57% in data centers: a server based on Open 3.0 has a claimed TCO of $4,589, versus $10,669 for one based on a traditional OEM motherboard. The AMD-provided example sounds nice, and despite likely being a best-case scenario, the idea behind the Open Compute Project and the AMD-specific Open 3.0 hardware does make sense. Customers should see more competition with motherboards that are cheaper to produce and run thanks to their open source nature. Further details on the status of Open 3.0 and the available hardware are being discussed at an invitation-only industry round-table this week between partners, interested enterprise customers, and a number of companies (including AMD, Broadcom, and Quanta).
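A quick check of AMD's arithmetic from the two dollar figures in the press release:

```python
# AMD's claimed total cost of ownership figures (from the press release)
open30_tco = 4589    # Open 3.0-based server, USD
oem_tco = 10669      # traditional OEM-motherboard server, USD

# Reduction relative to the OEM baseline
savings = 1 - open30_tco / oem_tco
print(f"TCO reduction: {savings:.0%}")   # → TCO reduction: 57%
```

So the "up to 57%" claim is the reduction measured against the OEM server's cost, which is the flattering direction to quote it in.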
For the uninitiated, the Open 3.0 hardware features a motherboard that measures 16" x 16.7" and is intended for 1U, 1.5U, 2U, and 3U servers. Each Open 3.0 board includes two AMD Opteron 6300 series processors, 24 DDR3 DIMM slots (12 per CPU: four channels with three DIMMs each), six SATA ports, one managed dual-channel Gigabit Ethernet NIC, up to four PCI-E slots, and a single mezzanine connector for custom modules (e.g. the Mellanox IO or Broadcom management card). Board IO will include a single serial port and two USB ports.
I'm glad to see AMD's side of the Open Compute Project come to fruition with the company's Open 3.0 hardware. Anything to reduce power usage and hardware cost is welcome in the data center world, and it will be interesting to see what kind of impact the open source hardware will have, especially when it comes to custom designs from system integrators. Intel is also working towards open source server hardware along with Facebook and the Open Compute Project. It is refreshing to see open source gaining traction in this market segment, to say the least.
Subject: General Tech | March 6, 2013 - 01:03 PM | Jeremy Hellstrom
Tagged: zrtp, sip, xmpp, voip, skype, open source, Jitsi, encryption
Jitsi seems to be a lot of things: an IM client aggregator such as Pidgin or Digsby, a combined XMPP and SIP VoIP client, and a videoconferencing hub with all traffic encrypted using ZRTP. This open source software also claims integration with Microsoft Outlook and the Apple Address Book, putting it in competition with Skype on more than one front. Unfortunately it will not connect to every online SIP or XMPP provider, but Jitsi does offer an open XMPP bridge to host video calls, and as it is open source there is no reason you could not construct your own. With the release of version 2.0, a host of new features and improvements have been added, which you can read about by following the links at Slashdot. They have also partnered with the FMJ Project to allow recording of sessions, as well as other possible customization thanks to the developers' wiki.
"Among the most prominent new features people will find quality multi-party video conferences for XMPP, audio device hot-plugging, support for Outlook presence and calls, an overhauled user interface and support for the Opus and VP8 audio/video codec. Jitsi has lately shaped into one of the more viable open Skype Alternatives with features such as end-to-end ZRTP encryption for audio and video calls. The 2.0 version has been in the works for almost a year now, so this is an important step for the project."
Here is some more Tech News from around the web:
- Seagate ships 'affordable' desktop hybrid drive @ The Register
- Intel Dishes On What Makes H.265 Worth Waiting For @ Techgage
- Samsung takes a three percent stake in Sharp for $105m @ The Inquirer
- Testing Batteries for Sulfation @ MAKE:Blog
- TP-LINK TL-WA850RE 300Mbps Universal Wireless N Range Extender Review @ Madshrimps
- TP-LINK TL-WDR3500 Wireless N600 Router @ Legit Reviews
- Leave Six Strikes Alone! @ Techgage
- Win Phanteks and NZXT hardware @ Kitguru
- Giveaway - GIGABYTE X79S-UP5-WIFI @ Tweaktown
Linaro Forms Linux Networking Group to Collaborate on Open Source Software for ARM Networking Hardware
Subject: General Tech | February 22, 2013 - 02:16 AM | Tim Verry
Tagged: oss, open source, networking, linux networking group, linux, linaro, arm
Linaro, a non-profit engineering group, announced a new collaborative organization called the Linux Networking Group at the Embedded Linux Conference in San Francisco this week. The new group will work on developing open source software to be used with ARM-based hardware in cloud, mobile, and networking industry sectors. Of course, being open source, the software for ARM SoCs will be used with Linux operating systems. One of the Linux Networking Group’s purposes is to develop a new “enhanced core Linux platform” for networking equipment, for example.
The new Linux Networking Group is currently comprised of the following organizations:
- Nokia Siemens Networks
- Texas Instruments
The new cooperative has announced four main goals for 2013:
- "Virtualization support with considerations for real-time performance, I/O optimization, robustness and heterogeneous operating environments on multi-core SoCs.
- Real-time operations and the Linux kernel optimizations for the control and data plane.
- Packet processing optimizations that maximize performance and minimize latency in data flows through the network.
- Dealing with legacy software and mixed-endian issues prevalent in the networking space."
Reportedly, Linaro will have an initial software release within the first half of this year. Further, the organization will follow up with monthly software updates to improve performance and add new features. More collaboration and the furthering of ARM-compatible open source software is always a good thing. It remains to be seen how useful the Linux Networking Group will be in pushing its ARM software goals, but here’s hoping it works out for the best.
The full press release can be found below.
I say let the world go to hell
… but I should always have my tea. (Notes From Underground, 1864)
You can praise video games as art to justify their impact on your life – but do you really consider them art?
Best before the servers are taken down, because you're probably not playing it after.
Art allows the author to express their humanity and permits the user to consider that perspective. We become cultured when we experiment with and to some extent understand difficult human nature problems. Ideas are transmitted about topics which we cannot otherwise understand. We are affected positively as humans in society when these issues are raised in a safe medium.
Video games, unlike most other mediums, encourage the user to coat the creation with their own expressions. The player can influence the content through their dialogue and decision-tree choices. The player can accomplish challenges in their own unique way and talk about it over the water cooler. The player can also embed their own content as a direct form of expression. The medium will also mature as we further learn how to leverage interactivity to open a dialogue for these artistic topics in completely new ways and not necessarily in a single direction.
Consciously or otherwise – users will express themselves.
With all of the potential for art that the medium allows it is a shame that – time and time again – the industry and its users neuter its artistic capabilities in the name of greed, simplicity, or merely fear.
Subject: General Tech | June 21, 2012 - 01:35 PM | Tim Verry
Tagged: rant, optimus, open source, nvidia, linux, linus, drivers
Last week, the founder of Linux – Linus Torvalds – gave a speech at the Aalto Center for Entrepreneurship. The aspect that most people picked up on was a certain disparaging statement towards NVIDIA. Since then, the video has spread rapidly around the Internet, with commenters both for and against the statement. In short, Linus does not believe that NVIDIA is easy to work with regarding Linux support. NVIDIA PR recently responded, stating that the company is in fact heavily involved with Linux development, albeit on mobile kernels.
NVIDIA stated in its PR release that supporting Linux is important to the company and that it understands how important a positive Linux experience using NVIDIA hardware is. I don’t think anyone is surprised by that statement, but that was not all they said. The company stated that it is a big supporter of the ARM Linux kernel, claiming the second-most total lines changed and the fourth-highest number of changesets in the kernel.
The company uses proprietary drivers, but it does support GeForce, Quadro, and Tesla graphics cards under the Linux operating system. By using a common, proprietary driver, NVIDIA claims same-day support for new graphics cards and OpenGL versions for both Windows and Linux operating systems.
Linus’ rant started when an audience member asked about Optimus support under Linux. On that front, NVIDIA did not have a direct answer – only that when it launched laptops with Optimus, the feature was only supported on Windows 7. Allegedly, the company is working to improve interaction between its drivers and the Bumblebee open source project. The Bumblebee project is working to make Optimus-powered laptops work with Linux operating systems.
What do you think of the two statements by Linus and NVIDIA? Should NVIDIA be held accountable for Optimus support under Linux? Is the company doing enough to support the OS? Or is Linus wrong? Let us know in the comments below!
Personally, as much as I like Linux, I don’t think NVIDIA should have to go out of its way to support Optimus on Linux. At least, not until the Linux OS is the operating system that comes pre-installed on an Optimus notebook. At that point, it would be on NVIDIA to provide support. Until then, they don’t have to support it on aftermarket / third-party operating systems. With that said, better Linux support couldn't hurt PR-wise. As far as Linux and NVIDIA working together in a more general sense, I think that the company could certainly do more for Linux on the desktop, especially being a Linux Foundation member, but I don't think they will until it is more financially viable to do so.
The full PR statement is available after the break.
Subject: General Tech | May 30, 2012 - 12:11 PM | Jeremy Hellstrom
Tagged: CUDA, open source, opengl
Hack a Day linked to a program that could be of great use for anyone who manipulates and processes images, or anyone who wants to be able to make fractals very quickly. Utilizing the OpenGL Shading Language, Reuben Carter developed a command line tool that processes images using NVIDIA GPUs. As we have talked about in the past on PC Perspective, GPUs are much better at this sort of parallel processing than a traditional CPU, or the CPU portion of modern processors. Below is one obvious use of this program, the quick creation of complex fractals, but it can also process pre-existing images. Edge detection, colour transforms, and perhaps even image recognition tasks can be completed with his software at a much faster speed than with CPU-bound image manipulation programs. If you are in that field, or looking to decorate your dorm room, you should grab his software via the GitHub link in the article.
"If you ever need to manipulate images really fast, or just want to make some pretty fractals, [Reuben] has just what you need. He developed a neat command line tool to send code to a graphics card and generate images using pixel shaders. Opposed to making these images with a CPU, a GPU processes every pixel in parallel, making image processing much faster."
Here is some more Tech News from around the web:
- Hard disk drive prices quick to rise, slow to fall @ The Register
- Microsoft's New User Agreement Bans Class Action Lawsuits @ NGOHQ
- AIDA64 v2.50 is released @ FinalWire