
    Raycasting a 2D value in the world (C++)

    Hello Community,

    I am trying to build a C++ plugin for the Pupil eye-tracking headset. ( https://github.com/SysOverdrive/UPupilLabsVR )

    So far I have the relevant data, and I am trying to raycast the values I receive from the device into a simple VR game.

    Input data:
    I get the x, y position of the gaze in a 2D plane, where (0,0) is the bottom-left corner of a square and (1,1) the top-right corner (normalized floats).
    I receive values such as (0.5, 0.5) for the center of the field of view, (0.3443, 0.34323) for the lower left, etc.

    Output:
    The position of the projected point in the world and a corresponding ray.
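
    To make the conventions concrete, here is a rough sketch of the input I receive and the output I am after (FGazeSample and FGazeRay are just illustrative placeholders, not my actual plugin types):

    Code:
    // Illustrative placeholders only, not the actual plugin types.
    struct FGazeSample
    {
        float NormX; // 0..1, left (0) to right (1)
        float NormY; // 0..1, bottom (0) to top (1); (0.5, 0.5) is the center of the FOV
    };

    // What I want to compute from one sample:
    struct FGazeRay
    {
        FVector Origin;    // projected point in the world
        FVector Direction; // corresponding ray direction
    };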

    Basically, I have looked over the mouse-to-world examples and other raycasting examples, and could not find any suitable code or linear transformations that would help.

    I did try the following by putting together some tutorials:

    Code:
    x = this->ReceivedGazeStructure->base_data.pupil.norm_pos.x;
    y = this->ReceivedGazeStructure->base_data.pupil.norm_pos.y;

    const FVector StartTrace = CameraLocation;
    const FVector ForwardVector = CameraRotator.Vector();

    FVector PlanePoint(x, y, 1);
    FVector EndPlanePointTrace; // This should be my endpoint

    EndPlanePointTrace = (ForwardVector * 220.f) + PlanePoint;

    Unfortunately, I did not have any luck.

    Does anyone have any idea how to implement this?

    P.S.: This is already implemented in Unity (the people who built the device have an open-source Unity plugin for it):
    https://github.com/pupil-labs/hmd-ey...Calibration.cs
    line 63. I am basically trying to do something similar to ViewportPointToRay.

    Kudos in advance ^^.


    #2
    The method you are looking for is DeprojectScreenPositionToWorld located in the APlayerController class. http://api.unrealengine.com/INT/API/...rld/index.html

    It takes ScreenX and ScreenY as input, so your positions will need to be translated into screen space (which I believe ranges from (0, 0) to (Width, Height)).
    You can get the game viewport size like this:

    Code:
    const FVector2D ViewportSize = FVector2D(GEngine->GameViewport->Viewport->GetSizeXY());
    So if I correctly understood how you store your data, then you should be able to accomplish what you need like this:

    Code:
    if (GEngine->GameViewport && GEngine->GameViewport->Viewport)
    {
        const FVector2D ViewportSize = FVector2D(GEngine->GameViewport->Viewport->GetSizeXY());
        FVector WorldLocation;
        FVector WorldDirection;

        // You will need to obtain a reference to your player controller
        if (PlayerController->DeprojectScreenPositionToWorld(
                ViewportSize.X * ReceivedGazeStructure->base_data.pupil.norm_pos.x,
                ViewportSize.Y * (1.0f - ReceivedGazeStructure->base_data.pupil.norm_pos.y),
                WorldLocation, WorldDirection))
        {
            const float TraceDistance = 100.0f; // Your desired trace distance (in UU - centimeters)

            // Your trace start and end
            FVector TraceStart = WorldLocation;
            FVector TraceEnd = TraceStart + WorldDirection * TraceDistance;
        }
    }
    Last edited by Ogniok; 05-16-2018, 03:02 AM.
    Gameplay programmer @ Techland by day
    Programmer @ Fireline Games at night



      #3
      Hi Ogniok and thank you,

      Your solution has proven helpful.

      I still do not understand why I always hit a seemingly random breakpoint on the line with DeprojectScreenPositionToWorld ("UE4Editor.exe has triggered a breakpoint.").
      After I press Continue it does not bother me anymore.

      Kudos



        #4
        ^ here's a snippet of my ray trace code. It traces into the world from the current mouse position so you can click on world objects.

        You might be getting an exception if you aren't normalizing the camera direction vector?

        Code:
        // cannot ray trace without player controller
        if (_playerController == nullptr)
        {
            return;
        }

        // get mouse position
        float mouseX;
        float mouseY;
        _playerController->GetMousePosition(mouseX, mouseY);

        // get current camera location, rotation, direction
        FVector cameraLocation = _playerController->PlayerCameraManager->GetCameraLocation();
        FRotator cameraRotation = _playerController->PlayerCameraManager->GetCameraRotation();
        FVector cameraDirection = cameraRotation.Vector().GetSafeNormal();

        // trace start location is the mouse cursor deprojected into world coordinates;
        // note that DeprojectScreenPositionToWorld also overwrites cameraDirection with
        // the world-space direction of the deprojected ray, which the trace end follows
        FVector traceStartLocation;
        FVector traceEndLocation;
        _playerController->DeprojectScreenPositionToWorld(mouseX, mouseY, traceStartLocation, cameraDirection);
        traceEndLocation = traceStartLocation + MAXIMUM_INTERACTION_DISTANCE * cameraDirection;
        Last edited by Jocko Jonson; 05-16-2018, 09:33 PM.



          #5
          Now I want to do something similar, but instead of doing a raycast I want to spawn a sphere in the world, having as input X and Y on a 2D plane:

          Code:
          XGaze += Radius * (float)FMath::Cos(2 * PI * (float)(CurrentCalibrationPoint - 1) / (CalibrationType2DPointsNumber - 1));
          YGaze += Radius * (float)FMath::Sin(2 * PI * (float)(CurrentCalibrationPoint - 1) / (CalibrationType2DPointsNumber - 1) + Offset);


          I have tried something like this:

          Code:
          if (UPupilPlayerController->DeprojectScreenPositionToWorld(ViewportSize.X * XGaze, ViewportSize.Y * (1 - YGaze), WorldLocation, WorldDirection))
          {
              FRotator CameraRotation = Camera->GetComponentRotation();

              const float TraceDistance = 200.0f; // Your desired trace distance (in UU - centimeters)
              TraceStart = WorldLocation;
              TraceEnd = WorldDirection * TraceDistance; // TODO FIGURE THIS ONE OUT
              SphereComponent->SetRelativeRotation(CameraRotation);
              SphereComponent->SetRelativeLocation(TraceEnd);
              DrawDebugLine(GetWorld(), TraceStart, TraceEnd, FColor(238, 0, 238), false);
          }

          The problem is that as I move the VR headset, the sphere's spawn position still depends on the VR orientation, even though the SphereComponent is attached to the Camera.

          How can I make it so I can always spawn a sphere in the middle of my field of view with
          float XGaze = 0.5;
          float YGaze = 0.5;
          ?!?

          I'm sure I have to do something with the camera/controller/VR rotation...

          P.S. If I manage not to move my head away from the initial camera position, the sphere spawns right in the center. If I move to the right, the sphere spawns a little more to the right, and if I move my head to the left, the sphere spawns a little to the left.
          Last edited by SysOverdrive; 05-29-2018, 06:37 PM.



            #6
            Try calculating TraceEnd like this:
            Code:
            TraceEnd = WorldLocation + WorldDirection * TraceDistance;
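
            Plugged into your snippet from #5, that would look roughly like this (a sketch reusing your variable names; note that TraceEnd is then a world-space position, so with the SphereComponent attached to the camera you probably want SetWorldLocation rather than SetRelativeLocation):

            Code:
            TraceStart = WorldLocation;
            TraceEnd = WorldLocation + WorldDirection * TraceDistance;

            // TraceEnd is in world space, so place the sphere with a world-space call
            // (SetRelativeLocation would treat it as an offset from the camera).
            SphereComponent->SetWorldLocation(TraceEnd);
            DrawDebugLine(GetWorld(), TraceStart, TraceEnd, FColor(238, 0, 238), false);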
            Gameplay programmer @ Techland by day
            Programmer @ Fireline Games at night
