I’m currently developing theoretical ideas for a lightweight game engine that uses an experimental system I call physics stamping—a technique that leverages 16-bit normal maps not just for lighting, but also to carry detailed physical metadata directly on the surface of the world.
Instead of traditional per-material physics definitions, this system embeds physics attributes (like slipperiness, friction, density hints, or AI affordance cues) directly into texture stamps. These stamps can be layered or tiled across a surface like decals, each carrying its own lighting and behavior payload.
At runtime, the engine uses repurposed ray tracing cores (or emulated ones if unavailable) to interpret these stamps for both lighting and AI perception. The idea is to allow AI agents to “see” and “feel” the environment with contextual awareness — understanding the material layout, danger zones, or walkability without excessive CPU-side logic or navmesh baking.
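To make the stamping idea concrete, here is a minimal sketch of how a single 16-bit RGBA texel could carry both a normal and physical metadata. The channel layout (normal XY in R/G, friction in B, bit flags in A) is my own assumption for illustration, not something the post specifies:

```python
import math

# Hypothetical texel layout (an assumption, not a fixed format):
#   R, G : 16-bit normal X/Y in [0, 65535], mapped to [-1, 1]; Z is reconstructed
#   B    : 16-bit friction coefficient, mapped to [0, 1]
#   A    : bit flags (walkable, danger, slippery, ...)

FLAG_WALKABLE = 1 << 0
FLAG_DANGER   = 1 << 1

def decode_stamp_texel(r, g, b, a):
    """Decode one 16-bit RGBA texel into (normal, friction, flags)."""
    nx = r / 65535.0 * 2.0 - 1.0
    ny = g / 65535.0 * 2.0 - 1.0
    # Reconstruct Z from the unit-length constraint (hemisphere assumption).
    nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    friction = b / 65535.0
    return (nx, ny, nz), friction, a

normal, friction, flags = decode_stamp_texel(32767, 32767, 49151, FLAG_WALKABLE)
# normal ≈ (0, 0, 1), friction ≈ 0.75, walkable flag set
```

The appeal of this packing is that one texture fetch serves both the lighting path (the normal) and the physics/AI path (friction and flags), which is what lets the same stamp drive rendering and perception.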
Features I can think of so far:
16-bit stamped normal maps that carry lighting and material context.
Real-time tagging that overlays surface behaviors for AI and physics. Think of it as a megatexture-style approach to physics stamps.
Repurposed RT-like feedback to allow AI systems to interpret world surfaces as readable terrain.
A projected 2D stamp system that can override or blend with base terrain data.
Lightweight performance goals for use on older hardware or future simulation platforms.
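The projected 2D stamp idea above (overriding or blending with base terrain data) could look something like the following sketch. Representing the base terrain and the stamp as simple 2D grids of friction values, and using an alpha weight to choose between override and blend, are illustrative assumptions:

```python
def apply_stamp(base, stamp, ox, oy, alpha=1.0):
    """Blend a 2D stamp of physics values over a base grid at offset (ox, oy).

    base, stamp: list-of-lists of floats (e.g. friction values).
    alpha: 1.0 fully overrides the base; lower values blend with it.
    """
    out = [row[:] for row in base]
    for y, row in enumerate(stamp):
        for x, v in enumerate(row):
            by, bx = oy + y, ox + x
            if 0 <= by < len(out) and 0 <= bx < len(out[0]):
                out[by][bx] = (1.0 - alpha) * out[by][bx] + alpha * v
    return out

base = [[0.5] * 4 for _ in range(4)]   # uniform base friction
ice  = [[0.1, 0.1], [0.1, 0.1]]        # slippery "ice" stamp
result = apply_stamp(base, ice, 1, 1, alpha=0.5)
# result[1][1] blends 0.5 and 0.1 into 0.3
```

Because stamps are applied non-destructively over a copy of the base data, several of them can be layered in sequence, which matches the decal-like layering described above.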
This engine is designed around flexibility, physical believability, and data efficiency. I believe these kinds of hybrid systems are going to be critical as we move toward more dynamic AI-driven environments and forensic-level virtual simulations.
If anyone’s working on similar experiments or wants to collaborate, I’m happy to discuss implementation details or speculative directions.
Update: Use vectors instead of repurposed RT cores. I am not a game engine designer, but I loved the tagging idea and ran with it. I’d love to see what kind of ideas a programmer could come up with.
Edit: I want to add this; it seems important. I’m just trying to brainstorm ways this engine could work.
Core Features of the Tagging System
Physics Stamping
Embeds surface properties (friction, density, slipperiness) directly into normal maps or decals.
Uses 16-bit normals to store both lighting and physical metadata.
AI Perception Mapping
Tags convey walkability, danger zones, and object affordance.
AI agents “read” the environment visually, like humans do, without navmesh dependency.
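A rough sketch of that navmesh-free perception step, assuming the environment has already been stamped into a 2D grid of tag bitfields (the flag values and grid representation are my own illustration):

```python
FLAG_WALKABLE = 1 << 0
FLAG_DANGER   = 1 << 1

def perceive(tag_map, x, y, radius=1):
    """Sample the stamped tag grid around (x, y) and summarize what the
    agent 'sees': which neighbouring cells are both walkable and safe."""
    safe = []
    h, w = len(tag_map), len(tag_map[0])
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            nx, ny = x + dx, y + dy
            if 0 <= nx < w and 0 <= ny < h:
                t = tag_map[ny][nx]
                if (t & FLAG_WALKABLE) and not (t & FLAG_DANGER):
                    safe.append((nx, ny))
    return safe

tags = [
    [1, 1, 0],
    [1, 3, 1],   # 3 = walkable + danger (e.g. a fire hazard)
    [0, 1, 1],
]
safe_cells = perceive(tags, 1, 1)
```

The agent never consults a baked navmesh; it reads affordances straight from the stamped surface, the way a player reads terrain visually.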
Real-Time Surface Interpretation
Repurposes RT cores (or emulated equivalents) to interpret stamped data.
Enables live analysis of material context, lighting, and physics by AI and rendering systems.
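On hardware without RT cores, the "emulated equivalent" could be as simple as marching a ray through the tag grid and collecting the stamped data it crosses. This is a deliberately naive CPU-side sketch of that idea (the grid traversal and step size are illustrative assumptions):

```python
def trace_tags(tag_map, x, y, dx, dy, max_steps=8):
    """March a ray through the tag grid and return the tags it crosses,
    emulating an RT-style 'surface read' without RT hardware."""
    hits = []
    for _ in range(max_steps):
        x += dx
        y += dy
        ix, iy = int(round(x)), int(round(y))
        if not (0 <= iy < len(tag_map) and 0 <= ix < len(tag_map[0])):
            break
        hits.append(tag_map[iy][ix])
    return hits

row_of_tags = [[1, 2, 3, 4]]
ahead = trace_tags(row_of_tags, 0, 0, 1, 0)   # tags along the agent's sight line
```

A real implementation would use a proper grid-traversal algorithm (e.g. DDA) and sub-texel sampling, but the core idea is the same: lighting and AI perception share one "read the surface along a ray" primitive.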
Decal-Like Stamp Overlays
Physics and metadata stamps are applied like megatextures or decals.
Supports layering or blending for dynamic terrain changes.
Lightweight Simulation
Minimizes CPU-side physics calculations.
Designed for efficiency on older hardware or mobile/future platforms.
RT + Vector Hybrid Use
Uses RT cores for lighting/reflection.
Uses vector-based pathing for AI movement, enhancing speed and realism.
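One reading of "vector-based pathing" is greedy steering: score each candidate step by how well its direction vector aligns with the goal, while skipping danger-tagged cells. This interpretation is my own sketch, not a method from the post:

```python
import math

FLAG_DANGER = 1 << 1

def steer(tag_map, pos, goal):
    """Pick the neighbouring cell whose direction best aligns with the
    goal vector while avoiding danger-tagged cells (greedy steering)."""
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    glen = math.hypot(gx, gy) or 1.0
    best, best_score = pos, -2.0
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == dy == 0:
                continue
            nx, ny = pos[0] + dx, pos[1] + dy
            if not (0 <= ny < len(tag_map) and 0 <= nx < len(tag_map[0])):
                continue
            if tag_map[ny][nx] & FLAG_DANGER:
                continue
            # Cosine similarity between the step direction and the goal direction.
            score = (dx * gx + dy * gy) / (math.hypot(dx, dy) * glen)
            if score > best_score:
                best, best_score = (nx, ny), score
    return best

grid = [[0, 0, 0],
        [0, 2, 0],   # 2 = danger tag in the agent's direct path
        [0, 0, 0]]
next_cell = steer(grid, (0, 1), (2, 1))   # detours around the danger cell
```

Unlike a full pathfinder, this makes only a local decision per frame, which fits the lightweight-performance goal; a fallback search would still be needed for dead ends.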
Forensic & Simulation Readiness
Tagged environments enable forensic replay, training AI, and environmental simulation with physical realism.
Infinite Resolution Concept
Encourages tiled, compressible, infinite resolution through stamps and tag layering.
Think: physically intelligent MegaData system.
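The tiling half of the infinite-resolution idea can be sketched with wraparound sampling: a small stamp tile answers queries over an unbounded surface, and layered stamps add detail only where placed. The grid-of-values representation is again an illustrative assumption:

```python
def sample_tiled(stamp, u, v):
    """Sample a stamp with wraparound coordinates, so one small tile
    covers an unbounded surface (the tiling half of 'infinite resolution')."""
    h, w = len(stamp), len(stamp[0])
    return stamp[int(v) % h][int(u) % w]

tile = [[1, 2],
        [3, 4]]
far_away = sample_tiled(tile, 5, 3)   # same data, no extra storage
```

Resolution then comes from layering, not storage: coarse tiled stamps everywhere, with finer local stamps blended on top where detail matters.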