Summit Technology Laboratory: Dynamic Projection Mapping on Moving Surfaces with Multiple Projectors

Posted by – May 9, 2026
Category: Exclusive videos

Summit Technology Laboratory demonstrates its software for creating large-scale, immersive experiences by automatically aligning and blending multiple projectors. The system creates a single, seamless image from two or more projectors, even on non-planar or dynamically moving surfaces. This technology is designed to eliminate the complex and time-consuming manual calibration process typically required for multi-projector setups.
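The automatic blending can be illustrated with a minimal sketch of standard edge blending: where two projectors overlap, per-pixel intensity ramps keep the summed light output constant across the seam. This is a generic technique for illustration only, not Summit's actual algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def blend_masks(width: int, overlap: int) -> tuple[np.ndarray, np.ndarray]:
    """Per-column intensity masks for two side-by-side projectors.

    The masks ramp linearly across the shared overlap region so that
    the combined brightness stays constant (left + right == 1).
    """
    left = np.ones(width)
    right = np.ones(width)
    ramp = np.linspace(1.0, 0.0, overlap)   # fade the left projector out
    left[width - overlap:] = ramp           # right edge of the left image
    right[:overlap] = ramp[::-1]            # left edge of the right image
    return left, right

left, right = blend_masks(width=1920, overlap=200)
# In the 200-pixel overlap the two contributions sum to full brightness:
assert np.allclose(left[1920 - 200:] + right[:200], 1.0)
```

In practice the ramp is also gamma-corrected to match the projectors' nonlinear response, which the linear ramp above omits.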


The core capability shown is dynamic projection mapping, where the software adapts the projected image in real time to a surface that is changing shape, such as a fluttering flag. The system uses a depth camera to track the surface geometry and continuously adjusts the projection to maintain a coherent and undistorted image. This real-time adaptation is a key differentiator, enabling interactive and dynamic visual displays that were previously not possible with multiple projectors.
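Summit does not disclose its warping method, but the idea of re-fitting the projection to tracked surface geometry each frame can be sketched with a standard tool: a homography estimated from four tracked corner points of a roughly planar patch, recomputed every frame as the surface moves. The direct linear transform (DLT) below is a textbook method; treating the flag as piecewise-planar is a simplifying assumption for illustration.

```python
import numpy as np

def homography(src, dst):
    """DLT: find 3x3 H with dst ~ H @ src in homogeneous coordinates."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)          # null vector = H up to scale

def apply_h(H, pt):
    """Map a 2D point through H (perspective divide included)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Projector-space corners of the content quad:
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
# Corners of the surface patch as tracked by the depth camera this frame:
dst = [(10, 20), (110, 25), (120, 140), (5, 130)]
H = homography(src, dst)
# Each frame, the content is warped by the refreshed H before projection.
```

For a genuinely non-planar, deforming surface, a real system would warp a dense mesh from the depth map rather than a single quad, but the per-frame fit-and-warp loop is the same shape.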

For sensing, the demonstration uses a consumer-grade RGB-D camera, specifically a Microsoft Azure Kinect, a time-of-flight (ToF) depth camera. The software's performance is currently limited by the frame rate and resolution of the camera hardware; the company notes that faster, higher-quality depth cameras would enable an even more seamless experience. Processing is handled by a high-performance computer that keeps up with the system's real-time demands.
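Why camera speed matters can be seen with a back-of-envelope latency model. The Azure Kinect's depth modes top out at 30 fps; every other number below (processing time, projector delay, surface speed) is an assumed illustrative value, not a measurement from the demo.

```python
def misalignment_mm(camera_fps: float, processing_ms: float,
                    projector_ms: float, surface_speed_mm_s: float) -> float:
    """Worst-case projection lag on a moving surface (illustrative model).

    End-to-end latency is modeled as one camera frame interval plus
    processing and projector delays; a surface moving at
    surface_speed_mm_s drifts by speed * latency before the corrected
    image lands on it.
    """
    latency_s = 1.0 / camera_fps + (processing_ms + projector_ms) / 1000.0
    return surface_speed_mm_s * latency_s

# Assumed: 30 fps depth camera, 10 ms processing, 16 ms projector delay,
# a flag edge moving at 500 mm/s:
print(round(misalignment_mm(30, 10, 16, 500), 1))  # → 29.7
```

Under these assumptions the camera's frame interval (~33 ms) dominates the latency budget, which is consistent with the company's point that a faster depth camera is the main path to a more seamless result.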

The software is hardware-agnostic, compatible with any brand or model of projector, including mixed-throw setups that combine short-throw and ultra-short-throw units. Setup is reduced to placing the projectors and cameras so they cover the target surface and running the software, which handles alignment and blending automatically. This makes complex projection mapping accessible without specialized calibration expertise.

Applications for this technology include large, immersive displays for simulations, such as the 180-degree cylindrical sea glider simulator mentioned, which uses three projectors. The technology also opens up possibilities for projection mapping onto moving objects, such as a performer’s dress on stage, creating dynamic and interactive visual effects for live events and entertainment.

source https://www.youtube.com/watch?v=-ZMFoJooYsU