Basemark Releases Basemark Web 3.0 with WebGL 2.0

Subject: General Tech | June 10, 2016 - 01:50 AM |
Tagged: Basemark, webgl, webgl2

Basemark has just released Basemark Web 3.0, which includes WebGL 2.0 tests for supporting browsers. No browsers support the standard by default yet, although it can be enabled on Firefox and Chrome with a command-line flag.
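
Once the flag is flipped, feature detection from JavaScript is trivial: asking a canvas for a "webgl2" context simply returns null where the standard is not exposed. A minimal sketch (the canvas id here is an assumption for illustration):

```javascript
// Minimal sketch: request a WebGL 2.0 context and fall back to WebGL 1.0.
const canvas = document.getElementById("view"); // hypothetical canvas element

let gl = canvas.getContext("webgl2");
const isWebGL2 = !!gl;
if (!gl) {
  // Browsers without WebGL 2.0 (or with it disabled) return null above.
  gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
}
console.log(isWebGL2 ? "Running on WebGL 2.0" : "Falling back to WebGL 1.0");
```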


WebGL 1.0 has become ubiquitous, but it is based on quite an old version of OpenGL. OpenGL ES 2.0 was specified all the way back in March 2007. While it simplified development by forcing everyone down a programmable shader pipeline, it has quite a few limitations. OpenGL ES 3.0 remedied many of these, such as allowing multiple render targets and texture compression. OpenGL ES 3.1 added compute shaders, which brings us pretty much to today. In fact, Vulkan targets OpenGL ES 3.1 hardware (should the hardware vendor provide a driver).

WebGL targeted OpenGL ES 2.0. WebGL 2 targets OpenGL ES 3.0.

Of course, this means that the WebGL 2.0 base standard does not support compute shaders, which is a bit of a drag. It's something the working group really wants to incorporate, but they still can't seem to decide whether it will align with a new version of WebGL (such as WebGL 2.1) or arrive as a multi-vendor extension.

So where are we today?

Well, WebGL 2.0 is still a little ways off from being everywhere. As we mentioned, only Firefox and Chrome support the standard, although WebKit is working on it, too. Microsoft has WebGL 2.0 listed as “Under Consideration” with a “Roadmap Priority” of Medium (“Development is likely for a future release”). One major hold-up has been shader support. Again, OpenGL ES 3.0 shaders are much more complex than OpenGL ES 2.0 ones, and many WebGL browsers convert OpenGL ES 2.0 shaders to HLSL for DirectX on Windows. This circumvents lackluster graphics drivers, and it adds an extra, huge layer of complexity for someone who wants to write malware: it's not sufficient to know of a driver bug triggered by a specific shader string -- you need to trick the transpiler into outputting it, too.

But, again, we're slowly inching our way there.

Source: Basemark

Microsoft Open-Sources Their WebGL Implementation

Subject: General Tech | June 9, 2016 - 01:42 AM |
Tagged: webgl, microsoft

Well, that's something I never expected to write. It turns out that Microsoft has open-sourced a small portion of their Edge web browser: the part that binds OpenGL ES 2.0 functionality, implemented atop Direct3D in Edge, to JavaScript so that websites can directly interact with the user's GPU (as opposed to, for instance, hardware-accelerated CSS effects).

Websites can use WebGL to share 3D objects in an interactive way, have interesting backgrounds and decorations, or even render a video game.
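
In practice, that binding looks like any other canvas API: ask the element for a rendering context and start issuing GL calls against it. A minimal sketch, assuming a canvas element with a made-up id:

```javascript
// Minimal sketch: grab a WebGL 1.0 context and clear the canvas to a solid color.
const canvas = document.getElementById("scene"); // hypothetical element id
const gl = canvas.getContext("webgl");

if (gl) {
  gl.clearColor(0.1, 0.2, 0.4, 1.0); // RGBA, each channel normalized to 0..1
  gl.clear(gl.COLOR_BUFFER_BIT);     // anything fancier means shaders and buffers
}
```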


This is not an open-source build of Microsoft Edge, though. It doesn't have the project files to actually be built into something useful. Microsoft intends for it to serve as a reference, at least for now, they say. If you are interested in using or contributing to this project for some reason, their GitHub readme file asks you to contact them. As for me? I just think it's neat.

Legend of Zelda in WebGL Voxels

Subject: General Tech | April 4, 2016 - 01:30 PM |
Tagged: zelda, webgl, Nintendo

Before it inevitably gets taken offline, you might want to check out a remake of the original Legend of Zelda. It's not just a straight port of the original, though. Its pixel art assets were remade in voxels, which are rendered in WebGL at an angle similar to what the original pixel art implies. The original NES controls are overlaid on the screen, which is useful for multi-touch, but a keyboard also works.


Most of the game is plug-in-free and runs in the browser. The only thing that requires plug-in support is audio, and it doesn't play nice with click-to-activate. It would have been nice for them to implement it with the Web Audio API, and the Gamepad API while they're at it, but who am I to criticize a passion project that will likely be challenged by Nintendo in a handful of days?
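
For what it's worth, basic Gamepad API support is only a handful of lines; here is a rough polling sketch (the axis and button mapping is an assumption for illustration, not anything taken from the remake):

```javascript
// Minimal sketch: poll the first connected gamepad once per animation frame.
function pollGamepad() {
  const pads = navigator.getGamepads ? navigator.getGamepads() : [];
  const pad = pads[0];
  if (pad) {
    const moveLeft  = pad.axes[0] < -0.5;                        // left stick, X axis
    const moveRight = pad.axes[0] >  0.5;
    const attack    = pad.buttons[0] && pad.buttons[0].pressed;  // usually the "A" button
    // ...hand these off to the game's input handler...
  }
  requestAnimationFrame(pollGamepad);
}

window.addEventListener("gamepadconnected", () => requestAnimationFrame(pollGamepad));
```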

I'm not sure how complete the game is. They seem to imply that all eight dungeons are available, but I haven't had a chance to check.

WebGL2 Is On Its Way

Subject: General Tech | December 31, 2015 - 10:25 PM |
Tagged: webgl2, webgl, mozilla, firefox

The Khronos Group created WebGL to bring a GPU-accelerated platform to web browsers. With a few minor differences, it is basically JavaScript bindings for OpenGL ES 2.0. It also pushed a few standards into JavaScript itself to support things like raw buffers of data whose contents can be viewed through types in an unmanaged way. Basically every latest-version web browser supports it these days, and we're starting to see it used in interesting ways.
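
Those raw buffers are the typed arrays that WebGL pushed into the language: an ArrayBuffer is just a block of bytes, and views such as Float32Array interpret it in place, without copying. A quick sketch:

```javascript
// Minimal sketch: one raw allocation, two typed views over the same bytes.
const raw = new ArrayBuffer(16);         // 16 bytes, no type attached

const asFloats = new Float32Array(raw);  // viewed as four 32-bit floats
const asBytes  = new Uint8Array(raw);    // viewed byte by byte

asFloats[0] = 1.5;
console.log(asBytes.slice(0, 4));        // the raw IEEE 754 bytes behind that 1.5
```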


The next step is WebGL2. OpenGL ES 3.0 adds a bunch of new features that are sorely needed by modern games and applications. For instance, it allows drawing to multiple render targets, which is very useful for virtual cameras in video games (although original WebGL could access this as an optional extension where supported). The addition of “Uniform Buffer Objects” is a better example. This allows you to store a bunch of data, like view transformation matrices, as a single buffer that can be bound to multiple shader programs, rather than uploading the values one at a time for every draw that needs them.

It's hard to describe, but demos speak a thousand words.
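
Still, a rough sketch of the Uniform Buffer Object flow gives the flavor (the block name "Matrices" and the buffer contents are assumptions for illustration, and `gl` and `program` are presumed to already exist):

```javascript
// Minimal sketch: share one buffer of uniforms across shader programs in WebGL 2.
// Assumes `gl` is a WebGL2RenderingContext and `program` declares a uniform
// block named "Matrices" in its GLSL ES 3.00 source.
const matrices = new Float32Array(32);   // e.g. view + projection matrices

const ubo = gl.createBuffer();
gl.bindBuffer(gl.UNIFORM_BUFFER, ubo);
gl.bufferData(gl.UNIFORM_BUFFER, matrices, gl.DYNAMIC_DRAW);

// Attach the buffer to binding point 0 once...
gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, ubo);

// ...then point any program's "Matrices" block at that binding point.
const blockIndex = gl.getUniformBlockIndex(program, "Matrices");
gl.uniformBlockBinding(program, blockIndex, 0);
```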

The news today is that Mozilla Nightly now ships with WebGL2 enabled by default. It was previously available behind an option in the browser, disabled by default. This doesn't seem like a big deal, but one of the largest hurdles to WebGL2 is how the browsers actually implement it. The shading language in WebGL was simple enough that most browsers convert it to DirectX HLSL on Windows. This is said to have the added advantage of obfuscating the ability to write malicious code, since developers never directly write what's executed. GLSL in OpenGL ES 3.0 is much more difficult to handle this way. I'm not sure whether the browsers will begin to trust OpenGL ES 3.0 drivers directly, or if they finally updated the GLSL translator, but a shipping implementation means that something was fixed.

Unfortunately, OpenGL compute shaders are not supported in WebGL2. That said, the biggest hurdle is, again, to get WebGL2 working at all. From my talks with browser vendors over the last year or so, it sounds like features (including compute shaders) should start flying in easily once this hurdle is cleared. From there, GPGPU in a website should be much more straightforward.

WebGL Leaves "Preview" with Unity 5.3

Subject: General Tech | December 8, 2015 - 07:30 AM |
Tagged: webgl, Unity

WebGL is a Web standard that allows issuing OpenGL ES 2.0-based instructions to compatible graphics cards, which covers just about everything today. It has programmable vertex and fragment (pixel) shaders with a decent amount of flexibility. Engines like Unity have been looking toward this technology as a compile target, because Web browsers are ubiquitous, relatively user-friendly, and based on standards that anyone could implement should a work of art benefit from preservation.


Image Credit: Mozilla

Until Unity 5.3, this feature sat at a “preview” level of support. The new release, scheduled for today according to their roadmap, drops that moniker: WebGL is now a build target with official support.

For running WebGL applications built in Unity, the vast majority of features target recent versions of Firefox, Chrome, and Edge on Windows 10 Version 1511. (The November Update for Windows 10 added the ability to lock the mouse cursor, which is obviously useful for mouse-and-keyboard titles.)
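
That cursor lock is exposed to the page through the Pointer Lock API; a minimal sketch of how a game canvas might request it (the element id and the look() handler are hypothetical):

```javascript
// Minimal sketch: lock the cursor to the game canvas and read relative motion.
const canvas = document.getElementById("game-canvas"); // hypothetical id

canvas.addEventListener("click", () => canvas.requestPointerLock());

document.addEventListener("mousemove", (e) => {
  if (document.pointerLockElement === canvas) {
    // movementX/Y are deltas, so the pointer never runs into the screen edge.
    look(e.movementX, e.movementY); // hypothetical camera-rotation function
  }
});
```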

We're still a long way from web browsers being equivalent to game consoles. That said, they are catching up fast. You could easily have an experience that shames the last generation, especially when WebGL 2 lands, and you don't have to worry about what happens in 10, 40, or even hundreds of years, as long as society deems your art worthy of preservation. I do hope that some artists and serious developers take real advantage of it, though. Shovelware could obscure its power and confuse users, and we know it will be pretty much first out of the gate.

Source: Unity

Tencent Buys Stake in Artillery Games

Subject: General Tech | October 21, 2015 - 09:48 PM |
Tagged: webgl, tencent, atlas, artillery games

The Chinese investment and Web company, Tencent, has taken an interest in many American video game companies. In a couple of installments, Tencent purchased chunks of Riot Games, developer of League of Legends, which now add up to over 90% of the game studio. They later grabbed a “minority” (~48%) stake in Epic Games, which creates Unreal Engine, Unreal Tournament, Fortnite, Infinity Blade, the original three Gears of War games, and a few other franchises.


This time, they purchased an undisclosed share of Artillery Games. Artillery has not released a title yet, but they are working on a WebGL-powered engine. In other words, titles created with this technology will run directly in web browsers without plug-ins or extensions. At some point, Artillery Games decided to make a native client alongside their web engine, which was announced in September. This was apparently due to latency introduced by the Pointer Lock API, as well as networking issues that will remain until WebRTC matures. (WebRTC brings P2P network sockets to web browsers. While people mentally equate it with video conferencing, it is also used for client-to-client multiplayer. There is even a BitTorrent client that runs in a web browser using it.)
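
For reference, the client-to-client piece of WebRTC is the data channel. A minimal sketch of opening one tuned for fast-paced traffic follows (signaling is omitted, and the channel options are assumptions about what a game would want, not Artillery's actual settings):

```javascript
// Minimal sketch: an unordered, no-retransmit RTCDataChannel for game-state updates.
// The signaling step (exchanging offers, answers, and ICE candidates) is omitted.
const pc = new RTCPeerConnection();

const channel = pc.createDataChannel("game-state", {
  ordered: false,     // don't stall newer packets behind a lost one
  maxRetransmits: 0   // UDP-like behavior: stale input is never resent
});

channel.onopen    = () => channel.send(JSON.stringify({ type: "hello" }));
channel.onmessage = (e) => console.log("peer says:", e.data);
```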

Unfortunately, the real story would be how much of Artillery they have purchased, and we don't know that yet (and may never). Tencent is buying up quite a lot of formerly independent studios, though, considering how many are left.

Graphics Developers: Help Name Next Generation OpenGL

Subject: General Tech, Graphics Cards | January 16, 2015 - 10:37 PM |
Tagged: Khronos, opengl, OpenGL ES, webgl, OpenGL Next

The Khronos Group probably wants some advice from graphics developers because they ultimately want to market to them, as the future platform's success depends on their applications. If you develop games or other software (web browsers?), then you can give your feedback. If not, then it's probably best to leave responses to its target demographic.


As for the questions themselves, first and foremost they ask whether you are (or were) an active software developer. From there, they ask you to score your opinion of OpenGL, OpenGL ES, and WebGL, whether you value “Open” or “GL” in the title, and whether you feel that OpenGL, OpenGL ES, and WebGL are related APIs. They also ask how you learn about the Khronos APIs. Finally, they directly ask you for name suggestions and any final commentary.

Now it is time to (metaphorically) read the tea leaves. The survey seems written primarily to establish whether developers consider OpenGL, OpenGL ES, and WebGL related libraries, and to gauge their overall interest in each. If you look at the way OpenGL ES has been developing, it has slowly brought mobile graphics toward a subset of desktop GPU features. It is basically an on-ramp to full OpenGL.

We expect that, like Mantle and DirectX 12, the next OpenGL initiative will be designed around efficiently loading massively parallel processors, with a little bit of fixed-function hardware for common tasks, like rasterizing triangles into fragments. The name survey might be implying that the Next Generation OpenGL Initiative is intended to be a unified platform for high-end, mobile, and even the web. Again, modern graphics APIs are based on loading massively parallel processors as directly as possible.

If you are a graphics developer, the Khronos Group is asking for your feedback via their survey.

"gorescript" Is an Indie, Browser-Based 3D Shooter

Subject: General Tech | January 16, 2015 - 02:36 PM |
Tagged: webgl, pc gaming

gorescript is a first-person shooter that runs in a browser through WebGL (via Three.js). Its developer, Time Invariant Games, has not mentioned a business model, if there is one, but the first three levels, three guns, and two monsters can be played instantly at their GitHub site for free. The source code, including a map editor for levels and an editor for monsters, weapons, and power-ups, is also on their GitHub under an MIT (permissive) license.


From a technical standpoint, it is not the most impressive game that I have played in a browser -- that would be a toss-up between Unreal Tournament 3 (they had a demo at Mozilla Summit 2013) and Dead Trigger 2. In gorescript on Firefox 35, I was getting about 80-110 FPS, although that began to crawl down to around 20-30 FPS with some dips to ~10 FPS; it was probably averaging ~45 FPS when all was said and done.

(Note: Especially if you have a high-refresh panel, the maximum frame rate of Firefox can be adjusted in about:config with the layout.frame_rate variable. Mine is set to 120.)
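
If you are wondering how numbers like that are gathered without any built-in tooling, a requestAnimationFrame counter is enough. A rough sketch:

```javascript
// Minimal sketch: estimate frames per second from requestAnimationFrame timestamps.
let frames = 0;
let last = performance.now();

function tick(now) {
  frames++;
  if (now - last >= 1000) {     // report roughly once per second
    console.log(`~${frames} FPS`);
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```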

Again, it is free and it should amuse you for a little while. Maybe we can get it to blow up with third-party content? Even as it is, I think it is worth a mention for anyone who wants a Doom/Quake throwback.

Watch Notch Make a Doom Renderer in Dart and WebGL

Subject: General Tech | August 27, 2014 - 04:54 PM |
Tagged: Notch, webgl, dart, doom

Notch, creator of Minecraft, is developing a rendering engine for Doom in Dart and WebGL (as a hobby, I assume). I am a little late to the party, and he has been developing for the last couple of hours now. If you are curious what it looks like to watch someone build a 3D rendering engine, this could be your chance. He also interacts with the chatroom, which should make it more interesting.


Dart is an open-source programming language that was released by Google in 2011. It compiles to JavaScript, but it can also be used to make applications via a modified Chromium browser with a built-in Dart virtual machine (VM). It can also be run from the command line.

Watching people program is picking up in popularity. While you would think that this is even more boring than watching people play video games, and you might be right, it could still gain an audience. Epic Games has been working to develop Twitch streaming capabilities directly within Unreal Engine 4's editor, to allow indies (or even large developers) to interact with fans and colleagues.

If interested, check out Notch's stream at Hitbox.tv.

Source: Hitbox.tv

Acer Unveils Chromebook 13 Powered By NVIDIA Tegra K1 SoC

Subject: General Tech, Mobile | August 11, 2014 - 08:00 AM |
Tagged: webgl, tegra k1, nvidia, geforce, Chromebook, Bay Trail, acer

Today Acer unveiled a new Chromebook powered by an NVIDIA Tegra K1 processor. The aptly-named Chromebook 13 is a 13-inch thin-and-light notebook running Google’s Chrome OS with up to 13 hours of battery life and three times the graphical performance of existing Chromebooks using Intel Bay Trail and Samsung Exynos processors.


The Chromebook 13 is 18mm thick and comes in a white plastic fanless chassis that hosts a 13.3” display, full-size keyboard, trackpad, and HD webcam. The Chromebook 13 will be available with a 1366x768 or 1920x1080 resolution panel depending on the particular model (more on that below).

Beyond the usual laptop fixtures, external I/O includes two USB 3.0 ports, HDMI video output, an SD card reader, and a combo headphone/mic jack. Acer has placed one USB port on the left side along with the card reader, and one USB port next to the HDMI port on the rear of the laptop. Personally, I welcome the HDMI port placement, as it means connecting a second display will not result in a cable invading the mousing area should I wish to use a mouse (and it’s even southpaw-friendly, Scott!).

The Chromebook 13 looks decent from the outside, but it is the internals that make the device really interesting. Instead of going with an Intel Bay Trail (or even a Celeron/Core i3), Acer has opted to team up with NVIDIA to deliver the world’s first NVIDIA-powered Chromebook.

Specifically, the Chromebook 13 uses an NVIDIA Tegra K1 SoC, up to 4GB of RAM, and up to 32GB of flash storage. The K1 offers up four A15 CPU cores clocked at 2.1GHz and a graphics unit with 192 Kepler-based CUDA cores. Acer rates the Chromebook 13 at 11 hours with the 1080p panel or 13 hours when equipped with the 1366x768 resolution display. Even being conservative, the Chromebook 13 looks to be the new leader in Chromebook battery life (with the previous leader claiming 11 hours).


A graph comparing WebGL performance between the NVIDIA Tegra K1, Intel (Bay Trail) Celeron N2830, Samsung Exynos 5800, and Samsung Exynos 5250. Results courtesy NVIDIA.

The Tegra K1 is a powerful little chip, and it is nice to see NVIDIA get a design win here. NVIDIA claims that the Tegra K1, which is rated at 326 GFLOPS of compute performance, offers up to three times the graphics performance of the Bay Trail N2830 and Exynos 5800 SoCs. Additionally, the K1 reportedly uses slightly less power and delivers higher multi-tasking performance. I’m looking forward to seeing independent reviews of the chip in this laptop form factor, and hoping that it lives up to its promises.

The Chromebook 13 is currently up for pre-order and will be available in September starting at $279. The Tegra K1-powered laptop will hit the United States and Europe first, with other countries to follow. Initially, the Europe roll-out will include “UK, Netherlands, Belgium, Denmark, Sweden, Finland, Norway, France, Germany, Russia, Italy, Spain, South Africa and Switzerland.”


Acer is offering three consumer SKUs and one education SKU that will be offered exclusively through a reseller. Please see the chart below for specifications and pricing.

Acer Chromebook 13 Model | RAM | Storage (flash) | Display | MSRP
CB5-311-T9B0 | 2GB | 16GB | 1920x1080 | $299.99
CB5-311-T1UU | 4GB | 32GB | 1920x1080 | $379.99
CB5-311-T7NN (base model) | 2GB | 16GB | 1366x768 | $279.99
Educational SKU (reseller only) | 4GB | 16GB | 1366x768 | $329.99

Intel made some waves in the Chromebook market earlier this year with the announcement of several new Intel-powered Chrome devices and the addition of conflict-free Haswell Core i3 options. It seems that it is now time for the ARM(ed) response. I’m interested to see how NVIDIA’s newest chip stacks up to the current and upcoming Intel x86 competition in terms of graphics power and battery usage.

As far as Chromebooks go, if the performance is at the point Acer and NVIDIA claim, this one definitely looks like a decent option considering the price. I think a head-to-head between the ASUS C200 (Bay Trail N2830, 2GB RAM, 16GB eMMC, and 1366x768 display at $249.99 MSRP) and the Acer Chromebook 13 would be interesting, as the real differentiator (beyond aesthetics) is the underlying SoC. I do wish there were a 4GB/16GB/1080p option in the Chromebook 13 lineup at, say, $320 MSRP, though, considering the big price jump to get 4GB of RAM (mostly a result of the doubling of flash) in the $379.99 model.

Read more about Chromebooks at PC Perspective!

Source: Acer