Last week, filmmakers, animators, and visualization artists converged on Los Angeles—the City of Dreams, as some call it—to learn and share tips and tricks at SIGGRAPH, the annual computer graphics show.
Here, attendees come to play with pixels in virtual and augmented reality (VR and AR) and to find new ways to bring digital models to life using real-world physics.
NVIDIA Moon Landing
NVIDIA's latest re-creation of the moon landing celebrates the event's 50th anniversary. Rendered on RTX GPUs, the scene achieves what camera technology of the time could not: a highly detailed virtual view of the moon's craters, the landing craft's mechanics, and the shifting lunar light, all reflected in the astronauts' suits and helmets.
RTX technology includes, among other things, hardware for deep learning. That capability speeds up rendering in CAD programs such as SOLIDWORKS through AI-powered denoising.
At SIGGRAPH, attendees also got the chance to experience the moon landing themselves. Part cinematic magic, part AI-driven real-time rendering, NVIDIA's demo instantly translates a volunteer's physical pose into an astronaut's figure and renders it as part of the moon landing scene.
In doing so, NVIDIA shows that, for some applications, AI-powered joint detection can produce the kind of realistic movement previously possible only with motion capture.
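To make the idea concrete, here is a minimal sketch (not NVIDIA's actual pipeline) of what happens after joint detection: the detector emits 2D keypoints for a volunteer's body, and the character rig needs joint angles recovered from them. The keypoint coordinates below are invented for illustration.

```python
import math

# Hypothetical post-processing step after AI joint detection:
# given detected 2D keypoints for shoulder, elbow, and wrist,
# recover the elbow bend angle so a character rig (here, the
# astronaut) can mirror the volunteer's pose.

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by keypoints a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])  # vector from elbow to shoulder
    v2 = (c[0] - b[0], c[1] - b[1])  # vector from elbow to wrist
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example keypoints (pixel coordinates) a detector might emit:
shoulder, elbow, wrist = (100, 200), (160, 240), (220, 200)
print(f"elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```

A real system does this in 3D for every joint of the skeleton, every frame, which is why fast GPU inference matters for a live demo.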
AMD Showcases AI-Driven Denoising
At the show, NVIDIA’s rival AMD demonstrated its own denoising feature in ProRender. Distributed free of charge, ProRender is AMD’s answer to rendering programs that work only with NVIDIA GPUs.
ProRender is available as a plug-in for leading CAD and 3D content-creation packages, including Modo, Autodesk Maya, Autodesk 3ds Max, Blender, and SOLIDWORKS.
Denoising uses machine learning to predict what a fully converged image will look like from a partially rendered, noisy one, so the renderer can stop tracing rays sooner and still deliver a clean result. Most rendering programs with denoising let users turn the feature on or off, depending on the level of accuracy desired.
KeyShot Starts Supporting GPU
For a long time, the KeyShot renderer has run strictly on the CPU, but at this SIGGRAPH, attendees found out that was about to change.
According to Luxion, the developer of KeyShot, NVIDIA RTX’s real-time rendering capability is the primary reason for adding GPU support. As a result, KeyShot will also offer AI-driven denoising, a feature of RTX.
KeyShot 9, expected to be available in fall 2019, is the first version of the renderer with GPU support. Depending on the type of scene and the hardware in your workstation, you will be able to run KeyShot 9 on either the CPU or the GPU. In CPU mode, Intel Open Image Denoise will kick in instead, said Luxion.
Currently, KeyShot’s GPU mode supports only NVIDIA RTX GPUs.
SME Additive Manufacturing Competition
SME, the professional association for manufacturing engineers, and its partner Stratasys, a 3D printer maker, are teaming up for the annual SkillsUSA Additive Manufacturing Competition. Now in its fifth year, the event aims to inspire and equip the next generation of engineers, still learning the craft, with the skills to design and create products that can be made with additive manufacturing.
The winning teams will receive scholarships from SME, along with free access to classes and conference passes, SOLIDWORKS CAD software, and a MakerBot mini printer.