I agree. Tracking a real physical mouse and keyboard 1:1 with virtual representations in VR using Lighthouse would be great, since a user could then be genuinely productive with traditional 2D interfaces in VR, such as complex parts of the editor like Blueprints. Why do everything with touch controllers? You could simply choose the best input device for each task. Tracking a physical keyboard means you could see it, pick it up, and move it around, while still knowing precisely where to peck the keys. With a flat representation of a Blueprint graph hovering in front of you, you could reach for the mouse and have precision control over nodes and their connections.