Web-based visualization for time-dynamic car simulation in a realistic 3D environment created from heterogeneous data.
Take advantage of pre-built content pipelines and datasets for point clouds, photogrammetry models, terrain, and vector data to build the 3D environment for your decision-making and localization algorithms. Embed semantics to drive classification rendering.
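As an illustration of what "embedding semantics for classification rendering" can mean in practice, here is a minimal sketch that maps semantic class names to render colors. The class names and palette are hypothetical, chosen for the example, and this is plain JavaScript rather than any specific Cesium API:

```javascript
// Illustrative only: map semantic classification codes (e.g. attached to a
// point cloud or photogrammetry model during the content pipeline) to RGBA
// render colors. Class names and colors are hypothetical.
const CLASS_COLORS = {
  ground:     [139, 115,  85, 255],
  building:   [170, 170, 170, 255],
  vegetation: [ 60, 140,  60, 255],
  vehicle:    [200,  40,  40, 255],
};

function colorForClass(name) {
  // Fall back to a neutral gray for unknown classes.
  return CLASS_COLORS[name] ?? [128, 128, 128, 255];
}

console.log(colorForClass("vehicle")); // [200, 40, 40, 255]
```

At render time, a styling engine would evaluate a lookup like this per point or per feature, so classes such as roads or vehicles can be highlighted or hidden independently.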
Use GPU-accelerated analytics to dynamically create and test the visibility coverage of sensor configurations for lidar, cameras, radar, and fused sensor combinations.
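The geometric core of such a coverage test can be sketched in a few lines. This is a simplified 2D version under stated assumptions (a sensor modeled as a field-of-view cone with a boresight direction, half-angle, and maximum range, in a local planar frame); the real analytics run on the GPU against full 3D terrain and models:

```javascript
// Simplified 2D sketch of a sensor-coverage test: is a target point inside a
// sensor's field-of-view cone (boresight direction, half-angle, both in
// radians) and within its maximum range? All coordinates are in a local
// planar frame in meters; the sensor model here is hypothetical.
function inCoverage(sensor, target) {
  const dx = target.x - sensor.x;
  const dy = target.y - sensor.y;
  const range = Math.hypot(dx, dy);
  if (range > sensor.maxRange) return false;
  // Angle between the boresight and the direction to the target,
  // wrapped into [0, pi].
  const bearing = Math.atan2(dy, dx);
  let diff = Math.abs(bearing - sensor.boresight);
  if (diff > Math.PI) diff = 2 * Math.PI - diff;
  return diff <= sensor.halfAngle;
}

const lidar = { x: 0, y: 0, boresight: 0, halfAngle: Math.PI / 4, maxRange: 100 };
console.log(inCoverage(lidar, { x: 50, y: 10 })); // true: in range, ~11 deg off axis
console.log(inCoverage(lidar, { x: 10, y: 50 })); // false: ~79 deg off a 45 deg cone
```

Running this test per sensor per frame, against many candidate mounting positions, is essentially what "dynamically create and test visibility coverage" amounts to, scaled up to 3D on the GPU.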
Simulate visibility between multiple moving cars and the surrounding 3D environment.
Use analytical shading to show distances and angles between cars.
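The quantities such shading visualizes reduce to simple geometry. Below is a sketch, assuming hypothetical car positions in a local east/north frame in meters; a shader would evaluate the same math per frame as the cars move:

```javascript
// Sketch of the quantities an analytical shader might display between two
// cars: straight-line distance and relative bearing, in a local east/north
// frame (meters). The positions are hypothetical.
function distanceBetween(a, b) {
  return Math.hypot(b.east - a.east, b.north - a.north);
}

function bearingDegrees(a, b) {
  // Bearing from car a to car b, measured clockwise from north.
  const deg = (Math.atan2(b.east - a.east, b.north - a.north) * 180) / Math.PI;
  return (deg + 360) % 360;
}

const carA = { east: 0, north: 0 };
const carB = { east: 30, north: 40 };
console.log(distanceBetween(carA, carB)); // 50
console.log(bearingDegrees(carA, carB)); // ~36.87, i.e. north-east of carA
```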
HD maps with heterogeneous data fusion
Build and stream high-definition maps from multiple heterogeneous sensors and static data sources using multi-core, I/O-efficient data tiling pipelines.
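The indexing step at the heart of any tiling pipeline can be illustrated with the standard Web Mercator (slippy-map) scheme: each input record is assigned to a tile by its coordinates and zoom level. This is a sketch only; HD-map pipelines typically use richer 3D tiling schemes, but the partitioning idea is the same:

```javascript
// Illustration of the indexing step in a tiling pipeline: map a WGS84
// longitude/latitude (degrees) to Web Mercator slippy-map tile coordinates
// at a given zoom level, using the standard formulas.
function lonLatToTile(lonDeg, latDeg, zoom) {
  const n = 2 ** zoom; // number of tiles along each axis at this zoom
  const x = Math.floor(((lonDeg + 180) / 360) * n);
  const latRad = (latDeg * Math.PI) / 180;
  const y = Math.floor(((1 - Math.asinh(Math.tan(latRad)) / Math.PI) / 2) * n);
  return { x, y, zoom };
}

console.log(lonLatToTile(0, 0, 1)); // { x: 1, y: 1, zoom: 1 }
console.log(lonLatToTile(-75.1652, 39.9526, 15)); // a zoom-15 tile over Philadelphia
```

In a multi-core pipeline, this function is the shard key: records with the same tile coordinates are routed to the same worker, so tiles can be built and written independently.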
Use our API to integrate your own sensor hardware and machine learning algorithms.
Fuse autonomous vehicle data with data from drones, satellites, and open data sources.
Fuse local datasets into a global map with centimeter-level rendering accuracy, on an engine originally designed for rocket scientists.
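One small piece of that fusion is georeferencing: converting a locally surveyed dataset's east/north offsets into global coordinates around a reference point. The sketch below uses a spherical-Earth approximation and a hypothetical reference location; reaching true centimeter accuracy requires full WGS84 ellipsoid math:

```javascript
// Sketch of the georeferencing step when fusing a locally surveyed dataset
// into a global map: convert local east/north offsets (meters) relative to a
// reference point into longitude/latitude. Spherical-Earth approximation;
// the reference coordinates below are hypothetical.
const EARTH_RADIUS_M = 6371000; // mean Earth radius, meters

function localToGlobal(refLonDeg, refLatDeg, eastM, northM) {
  const degPerRad = 180 / Math.PI;
  const refLatRad = (refLatDeg * Math.PI) / 180;
  const lat = refLatDeg + (northM / EARTH_RADIUS_M) * degPerRad;
  // Meridians converge toward the poles, so east offsets are scaled by cos(lat).
  const lon = refLonDeg + (eastM / (EARTH_RADIUS_M * Math.cos(refLatRad))) * degPerRad;
  return { lon, lat };
}

// A point 100 m north of a hypothetical reference location.
console.log(localToGlobal(-75.1652, 39.9526, 0, 100));
```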
3D in-dash visualization
Stream and render 3D maps for navigation and infotainment on any device that runs WebGL.
Combine real-time data and photogrammetry models with semantics to give riders a unique experience.
Build on the extensible open-source CesiumJS API used across many verticals from A&D to BIM to agriculture to entertainment.
Tap into the large CesiumJS developer community and its support resources.
Deploy to the web on multiple platforms and devices.