Subject: General Tech | January 7, 2016 - 08:13 PM | Scott Michaud
Tagged: square enix, shinra, cloud computing
Shinra Technologies was a cloud computing service from Square Enix. Unlike OnLive and PlayStation Now, it did not intend to stream existing titles to incompatible devices by hooking input and returning video. Instead, Square Enix wanted the service to enable new titles, including from third parties like Ubisoft, that offload large computational elements to its servers. Those would need to be latency-insensitive workloads, though the details would, of course, be game-dependent. This is similar to Microsoft's cloud compute program, which is said to power the upcoming Xbox One game Crackdown 3.
Image Credit: Final Fantasy Wikia
We now say was, because Square Enix killed the program. It was created as a subsidiary to keep it separate from the game development division, which might have given some comfort to third-party developers. It also allowed them to secure funding for the program independently, except that investors did not sign on. Without external capital, Square Enix dissolved the division at a loss of about $17 million USD.
Subject: General Tech | March 27, 2014 - 12:49 AM | Tim Verry
Tagged: remote graphics, nvidia, GTC 2014, gpgpu, emerging companies summit, ecs 2014, cloud computing
NVIDIA started the Emerging Companies Summit six years ago, and since then the event has grown in size and scope to identify and support technology companies that leverage (or plan to leverage) GPGPU computing to deliver innovative products. The ECS continues to be a platform for new startups to showcase their work at the annual GPU Technology Conference. NVIDIA provides legal, development, and co-marketing support to the companies featured at ECS.
There was an interesting twist this year, though, in the form of the Early Start Challenge, a new aspect of ECS in addition to the ‘One to Watch’ award. I attended the Emerging Companies Summit again this year and managed to snag some photos and participate in the Early Start Challenge (disclosure: I voted for AudioStream TV).
The 12 Early Start Challenge contestants take the stage at once to await the vote tally.
During the challenge, 12 selected startup companies were each given eight minutes on stage to pitch their company and explain why their innovations deserved the $100,000 grand prize. The on-stage time was split into a four-minute presentation and a four-minute Q&A session with the panel of judges (unlike last year, the audience was not part of the Q&A session at ECS this year due to time constraints).
After all 12 companies had their chance on stage, the panel of judges and the audience submitted their votes for the most innovative startup. The panel of judges included:
- Scott Budman Business & Technology Reporter, NBC
- Jeff Herbst Vice President of Business Development, NVIDIA
- Jens Hortsmann Executive Producer & Managing Partner, Crestlight Venture Productions
- Pat Moorhead President & Principal Analyst, Moor Insights & Strategy
- Bill Reichert Managing Director, Garage Technology Ventures
The companies participating in the challenge include Okam Studio, MyCloud3D, Global Valuation, Brytlyt, Clarifai, Aerys, oMobio, ShiVa Technologies, IGI Technologies, Map-D, Scalable Graphics, and AudioStream TV. The companies are involved in machine learning, deep neural networks, computer vision, remote graphics, real time visualization, gaming, and big data analytics.
After all the votes were tallied, Map-D was revealed to be the winner and received a check for $100,000 from NVIDIA Vice President of Business Development Jeff Herbst.
Jeff Herbst awarding Map-D's CEO with the Early Start Challenge grand prize check. From left to right: Scott Budman, Jeff Herbst, and Thomas Graham.
Map-D is a company that specializes in a scalable in-memory GPU database that promises millisecond queries directly from GPU memory (with GPU memory bandwidth as the bottleneck) and very fast database inserts. The company is working with Facebook and PayPal to analyze data. In the case of Facebook, Map-D is being used to analyze status updates in real time to identify malicious behavior. The software can be scaled across eight NVIDIA Tesla cards to analyze a billion tweets in real time.
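To illustrate the general idea behind an in-memory columnar database like Map-D's, here is a minimal sketch. The function names and data are invented for illustration; NumPy stands in for GPU memory, where a real system would run CUDA kernels so that a query becomes a single bandwidth-bound pass over each contiguous column.

```python
import numpy as np

def columnar_scan(timestamps, user_ids, flagged, since):
    """Return ids of users with flagged activity at or after `since`.

    Each argument is one column stored as a contiguous array, so the
    filter is a vectorized pass over memory rather than a row-by-row loop.
    """
    mask = (timestamps >= since) & flagged   # one vector op per column
    return np.unique(user_ids[mask])

# Toy columns: five events, three users, some flagged as malicious.
timestamps = np.array([10, 20, 30, 40, 50])
user_ids   = np.array([ 1,  2,  1,  3,  2])
flagged    = np.array([False, True, True, False, True])

print(columnar_scan(timestamps, user_ids, flagged, since=25).tolist())  # [1, 2]
```

The key design point is that the data never leaves (GPU) memory between queries; only the small result set is copied out, which is why queries can complete in milliseconds.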
It is specialized software, but extremely useful within its niche. Hopefully the company puts the prize money to good use in furthering its GPGPU endeavors. Although there was only a single grand prize winner, I found all the presentations interesting and look forward to seeing where they go from here.
Subject: General Tech | May 2, 2013 - 09:07 AM | Tim Verry
Tagged: windows, thin client, remote desktop, mohoro, microsoft, cloud computing, azure
Microsoft may be working on its own cloud-based desktop service according to sources speaking with ZDNet’s Mary Jo Foley. The rumored service codenamed “Mohoro” would build the Windows desktop SaaS (Software as a Service) solution on top of the company’s Windows Azure cloud computing platform. With Mohoro, Microsoft would provide Azure virtual machines running the Windows operating system. Users would then be able to remote into the desktop on any Internet connected computer or mobile device (with remote desktop support) and get access to their own desktop and applications.
The Windows desktop... coming soon to a cloud near you?
Windows Azure users can already run virtual machines with Linux or Windows operating systems, but in the case of Windows, Microsoft only allows server versions to be run. Licensing restrictions prevent users from loading consumer operating systems such as Windows XP, 7, or 8 onto the virtual machines. The rumored Mohoro service would apparently relax those restrictions and allow businesses or consumers to deploy client operating systems on the Azure VMs. It would essentially take the hardware that enterprises currently run themselves and move it to “the cloud” behind a Microsoft-run subscription service.
It is an interesting idea that I could see universities and businesses looking into. The Azure platform is actually pretty good, from what little testing I've done on it. However, I think that for many consumers a local install is preferable. Although syncing applications and files can be a pain if you have multiple machines, you retain control of your data and are not bound to an always-on Internet connection to access that data and run applications. Further, latency issues and bandwidth caps on home Internet connections make a paid-for Azure desktop less appealing to home users. I think Microsoft would have a hard enough time selling users a subscription for a local/traditional Windows installation, much less a subscription for an OS that requires an always-on Internet connection just to use their computer.
Would you use an Azure-powered desktop as your main OS?
GTC 2013: Cortexica Vision Systems Talks About the Future of Image Recognition During the Emerging Companies Summit
Subject: General Tech, Graphics Cards | March 21, 2013 - 01:44 AM | Tim Verry
Tagged: video fingerprinting, image recognition, GTC 2013, gpgpu, cortexica, cloud computing
The Emerging Companies Summit is a series of sessions at NVIDIA's GPU Technology Conference (GTC) that gives the floor to CEOs from several up-and-coming technology startups. Earlier today, the CEO of Cortexica Vision Systems took the stage to talk briefly about the company's products and future direction, and to answer questions from a panel of industry experts.
If you tuned into NVIDIA's keynote presentation yesterday, you may have noticed the company showing off a new image recognition technology. That technology is being developed by a company called Cortexica Vision Systems. While it cannot perform facial recognition, it is capable of identifying just about everything else, according to the company's CEO Ian McCready. Currently, Cortexica employs a cluster of approximately 70 NVIDIA graphics cards, but the system can scale beyond that. McCready estimates that about 100 GPUs and a CPU would be required by a company like eBay, should it want to implement Cortexica's image recognition technology in-house.
The Cortexica technology uses images captured by a camera (such as the one in your smartphone), which is then sent to Cortexica's servers for processing. The GPUs in the Cortexica cluster handle the fingerprint creation task while the CPU does the actual lookup in the database of known fingerprints to either find an exact match, or return similar image results. According to Cortexica, the fingerprint creation takes only 100ms, though as more powerful GPUs make it into mobile devices, it may be possible to do the fingerprint creation on the device itself, reducing the time between taking a photo and getting relevant results back.
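The split described above, with fingerprint creation on the GPUs and database lookup on the CPU, can be sketched in miniature. Everything here is an illustrative stand-in: Cortexica's actual descriptors are proprietary, so this toy version uses a tiny block-averaged thumbnail as the "fingerprint" and a brute-force nearest-neighbor search as the lookup.

```python
import numpy as np

def fingerprint(image, size=4):
    """Downsample a grayscale image to a small feature vector.

    Stands in for the GPU fingerprinting step: each component is the
    mean of one (h/size x w/size) block of pixels.
    """
    h, w = image.shape
    thumb = image.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    return thumb.ravel()

def lookup(query_fp, database):
    """CPU step: return the name of the closest known fingerprint."""
    names = list(database)
    dists = [np.linalg.norm(query_fp - database[n]) for n in names]
    return names[int(np.argmin(dists))]

rng = np.random.default_rng(0)
logo = rng.random((32, 32))
db = {"logo": fingerprint(logo), "other": fingerprint(rng.random((32, 32)))}

# A slightly noisy "photo" of the logo should still match the original,
# because the block averages smooth out small per-pixel differences.
photo = np.clip(logo + rng.normal(0, 0.05, logo.shape), 0, 1)
print(lookup(fingerprint(photo), db))  # prints "logo"
```

Moving `fingerprint` onto the capturing device, as the article suggests future mobile GPUs could allow, would mean uploading only the tiny vector instead of the full image, cutting round-trip time.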
The image recognition technology is currently being used by eBay Motors in the US, UK, and Germany. Cortexica hopes to find a home with fashion companies that would use the technology to let people identify, and ultimately purchase, clothing they photograph on television or in public. The technology can also perform 360-degree object recognition, identify logos as small as 0.4% of the screen, and identify videos. In the future Cortexica hopes to reduce latency, improve recognition accuracy, and add more search categories. Cortexica is also working on enabling an "always on" mobile device that will constantly be identifying everything around it, which is both cool and a bit creepy. With mobile chips like Logan and Parker coming in the future, Cortexica hopes to be able to do on-device image recognition, which would greatly reduce latency and allow the recognition technology to be used while not connected to the internet.
The number of photos taken is growing rapidly: as many as 10% of all photos stored "in the cloud" were taken in the last year alone. Even Facebook, with its massive data centers, is moving to a cold-storage approach to cut the electricity costs of storing and serving those photos. And while some photos carry relevant metadata, the majority do not; Cortexica claims its technology can work around that issue by identifying photos, as well as finding similar ones, using its algorithms.
Stay tuned to PC Perspective for more GTC coverage!
Additional slides are available after the break:
Subject: General Tech | February 22, 2013 - 12:31 PM | Tim Verry
Tagged: servers, facebook, exabyte, data centers, cold storage, cloud computing
Facebook is planning to construct a new cold storage facility to house archived and less-frequently-used media files. The new data center will reside in a new 62,000 sq. ft. building on the company's existing 127-acre property in Prineville, Oregon.
As cold storage, the data center will house servers with up to 3 exabytes of total capacity. The machines will sleep the majority of the time, waking automatically to serve media files when they are accessed on the social network. Because the servers normally sit in a low-power sleep state, there will be a slight delay when users request files; according to Oregon Live, Facebook says the delay will range from several milliseconds to as much as a couple of seconds.
The new cold storage facility will let Facebook save a great deal on electricity and hardware wear and tear (though primarily on the power bill). The company claims that its users upload 350 million photos each day, but that 82% of the social networking site's traffic focuses on a mere 8% of available photos.
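Those access numbers are exactly the shape of workload a hot/cold split exploits. As a minimal sketch (not Facebook's actual system), a tiering policy can simply rank photos by access count and keep only the small hot fraction on always-on servers, archiving the long tail to sleeping machines:

```python
def tier_photos(access_counts, hot_fraction=0.08):
    """Split photo ids into (hot, cold) tiers by access frequency.

    `access_counts` maps photo id -> number of recent accesses. The top
    `hot_fraction` of photos stay on always-on servers; the rest move to
    cold storage, where servers sleep until a request wakes them.
    """
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    cutoff = max(1, round(len(ranked) * hot_fraction))
    return ranked[:cutoff], ranked[cutoff:]

# Toy workload: two photos dominate traffic, the rest are rarely touched.
counts = {"a": 900, "b": 850, "c": 12, "d": 9, "e": 5, "f": 3,
          "g": 2, "h": 2, "i": 1, "j": 1, "k": 0, "l": 0}
hot, cold = tier_photos(counts, hot_fraction=0.08)
print(hot)  # ['a'] -- 8% of 12 photos rounds to a single hot entry
```

A production system would also re-tier periodically and promote cold photos that suddenly go viral, but the payoff is the same: the vast majority of disks can sleep almost all of the time.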
Err, not quite the cold storage Facebook has in mind...
Considering Facebook's existing Prineville data center used a whopping 71 million kilowatt-hours of energy in its first nine months, moving infrequently accessed files to a new cold storage system is an excellent idea. The photos will still be available, but Facebook will save big on the power bill--a fair compromise for retaining all of those lolcat and meme photos, I think.
The new data center will be rolled out in three phases, each measuring 16,000 sq. ft. in the Prineville facility. The first phase of cold storage servers should be up and running by Q4 2013. There is no estimate on the power savings, but it will be interesting to see how beneficial it will be--and whether other cloud service providers will adopt similar policies.
Also read: Amazon Glacier offers cheap long-term storage.
Subject: Editorial, General Tech, Graphics Cards, Processors | May 19, 2012 - 08:52 PM | Scott Michaud
Tagged: ultrabook, trinity, cloud computing, cloud, amd
Bloomberg Businessweek reports that AMD CEO Rory Read claims his company will produce chips suited to consumer needs rather than to crunching larger and larger bundles of information. They also like eating Intel’s bacon -- the question: is it from a pig or a turkey?
Read believes there is “enough processing power on every laptop on the planet today”.
The argument revolves around the shift to the cloud, as usual. It is very alluring to shift focus from the instrument to the data itself. More enticing: discussing how the instruments change to suit that need; this is especially true if you develop instruments and yearn to shift anyway.
Don’t question the bacon…
AMD is betting that its processors will be good enough and that its products will differentiate in other ways, such as graphics capabilities, which it claims will matter more for cloud services. AMD hopes that its newer laptops will steal some bacon from Intel and its ultrabook initiative.
The main problem with the cloud is that it is mostly something people feel they want rather than something they actually want. They believe they want a company to control their content for them, until that content becomes inaccessible, temporarily or permanently. They believe they want their information accessible in online services, but then freak out about the privacy implications.
The public appeal of the cloud is that it lets you feel as though you can focus on the content rather than the medium. The problem is that you do not have fewer distractions from your content -- just different ones -- and they rear their heads one or two at a time: a privacy concern here, an incompatibility or licensing issue there. For some problems and for some people, it makes more sense to control your own data, and it will continue to be important to serve that market.
And if crunching ends up being necessary for the future it looks like Intel will be a little lonely at the top.
Subject: General Tech | June 12, 2011 - 02:36 PM | Tim Verry
Tagged: networking, dell, cloud computing
A recent survey conducted during the first two days of the Cloud Expo by Marketing Solutions and sponsored by Dell suggests that IT professionals believe their less technical CEOs view cloud computing as a "fad" that will soon pass. IT departments, on the other hand, see the opportunities and potential of the technology. This gap between the two groups, according to Dell, lies in "the tendency of some enthusiasts to overhype the cloud and its capacity for radical change." Especially with a complex and still-evolving technology like cloud computing, CEOs are less likely to see the potential benefits and more likely to focus on the obstacles and costs of adoption.
The study surveyed 223 respondents from various industries (excluding technology providers) and found that IT professionals' attitudes toward "the cloud" differed considerably from what they believed their CEOs' attitudes to be. The pie graphs in figure 1 below illustrate this gap. While 47% of those in IT see cloud computing as a natural evolution of the trend toward remote networks and virtualization, only 26% believed their CEOs agreed. Also, while 37% of IT professionals stated that cloud computing is a new way to think about their function in IT, "37 percent deemed their business leaders mostly likely to describe the cloud as having “immense potential,” contrasted with only 22 percent of the IT pros who said that was their own top descriptor."
Further, the survey examined what both IT professionals and CEOs believed to be obstacles in the way of adopting cloud computing. On the IT professionals' front, 57% believed data security to be the biggest issue, 32% stated industry compliance and governance as the largest obstacle, and 27% thought disaster recovery options to be the most important barrier, contrasted with 51%, 30%, and 22% of CEOs. This comparison can be seen in figure 2 below.
While the survey handily indicates that enterprises' IT departments are the most comfortable with adopting cloud computing, other areas of the business that could greatly benefit are far more opposed. As seen in figure 3, while 66% of IT departments are willing to advocate for cloud computing, only 13% of Research and Development, 13% of Strategy and Business Development, and a mere 5% of Supply Chain Management departments feel that they would move to cloud computing and benefit from it.
Dell stated that IT may be able to help many more functions and departments by advocating for and implementing cloud computing strategies in information-gathering and data-analysis departments. In doing so, IT could benefit the entire company and further educate CEOs on cloud computing's usefulness, closing the gap between IT professionals' and CEOs' beliefs.
You can read more about the Dell study here. How do you feel about cloud computing?