This is actually three tests in one: threading, UDP socket networking, and sensor data handling.
The Android app
The Android phone app gathers sensor data, calculates its orientation as yaw, pitch, and roll, and sends this rotation data over Wi-Fi to UE4 running on a PC. UE4 receives the data and applies the rotation to a bar-shaped phone mesh in real time, mirroring the orientation of the phone in the real world.
Apart from jitter at high frequencies (because there is no low-pass filter yet), it seems to work pretty nicely overall and could be a good low-cost substitute for very basic motion control in PC games. It needs more work to be practically usable, though, and the same pipeline could also carry other sensor data such as GPS, magnetometer, light, and proximity.
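For reference, the jitter could likely be tamed with a one-line exponential low-pass filter. This is a hypothetical sketch, not part of the project: the alpha value would need tuning, and a real angle filter would also have to handle the 0/360 wrap-around.

```java
// Exponential moving-average low-pass filter for a stream of samples.
// Sketched here because the jitter comes from feeding raw sensor-derived
// angles straight to the mesh with no smoothing.
// Note: a production angle filter must also handle the 0/360 wrap-around.
public class LowPass {
    private final double alpha;   // 0..1; smaller = smoother but laggier
    private double state;
    private boolean primed;

    LowPass(double alpha) { this.alpha = alpha; }

    double filter(double sample) {
        if (!primed) { state = sample; primed = true; }
        else state += alpha * (sample - state);   // move a fraction toward the new sample
        return state;
    }

    public static void main(String[] args) {
        LowPass lp = new LowPass(0.2);   // alpha = 0.2 is an arbitrary starting point
        for (double x : new double[] {10, 10, 50, 10, 10}) {   // one jittery spike
            System.out.printf("%.2f%n", lp.filter(x));
        }
    }
}
```

The spike at 50 only pulls the output partway up, which is exactly the trade-off: less jitter, slightly more lag.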
A little more detailed description:
The Android app gathers accelerometer and magnetometer readings using the Android Java API, then processes these readings into an orientation in 3D space with the phone as the origin of its coordinate space.
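That fusion step can be sketched in plain Java. This is not the app's actual code (on Android you would call SensorManager.getRotationMatrix followed by SensorManager.getOrientation); it only shows the underlying math, and the axis/sign conventions here are assumptions that may differ from Android's.

```java
// Hypothetical plain-Java sketch of deriving yaw/pitch/roll from an
// accelerometer vector (gravity) and a magnetometer vector.
// Conventions assumed: x right, y toward the top edge, z out of the screen;
// flat face-up phone reads a = (0, 0, +g).
public class Orientation {
    // Returns {yaw, pitch, roll} in radians.
    static double[] fromSensors(double ax, double ay, double az,
                                double mx, double my, double mz) {
        // Pitch and roll come from the direction of gravity alone.
        double pitch = Math.atan2(ay, Math.sqrt(ax * ax + az * az));
        double roll  = Math.atan2(-ax, az);
        // Simple heading from the magnetometer: valid only when the phone is
        // roughly level; Android's getRotationMatrix tilt-compensates properly.
        double yaw = Math.atan2(mx, my);
        return new double[] { yaw, pitch, roll };
    }

    public static void main(String[] args) {
        // Phone lying flat, top edge pointing magnetic north
        // (field points north and into the ground in the northern hemisphere):
        double[] ypr = fromSensors(0, 0, 9.81, 0, 22, -40);
        System.out.printf("yaw=%.2f pitch=%.2f roll=%.2f (deg)%n",
                Math.toDegrees(ypr[0]), Math.toDegrees(ypr[1]), Math.toDegrees(ypr[2]));
    }
}
```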
Next, the roll, pitch, and yaw values are mapped onto 0–65535 so each angle fits in 16 bits for network transmission; these 3 × 16 bits are then sent over UDP from the app to the PC.
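The encode-and-send step might look like the sketch below. The class name, target address, and port are placeholders, not taken from the project.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Hypothetical sketch: map each angle onto the full 16-bit range and
// send all three as one 6-byte UDP datagram.
public class OrientationSender {
    // Map degrees onto [0, 65535]; negative angles wrap into [0, 360) first.
    static int toU16(double degrees) {
        double d = ((degrees % 360) + 360) % 360;
        return (int) Math.round(d / 360.0 * 65535.0) & 0xFFFF;
    }

    // Pack yaw/pitch/roll as 3 x 16 bits, big-endian.
    static byte[] encode(double yaw, double pitch, double roll) {
        ByteBuffer buf = ByteBuffer.allocate(6);   // big-endian by default
        buf.putShort((short) toU16(yaw));
        buf.putShort((short) toU16(pitch));
        buf.putShort((short) toU16(roll));
        return buf.array();
    }

    public static void main(String[] args) throws Exception {
        byte[] payload = encode(90.0, 45.0, 180.0);
        try (DatagramSocket socket = new DatagramSocket()) {
            // 127.0.0.1:7777 is a placeholder; the app would use the PC's LAN IP.
            socket.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getByName("127.0.0.1"), 7777));
        }
    }
}
```

UDP is a reasonable choice here: a stale orientation packet is useless, so there is no point retransmitting it the way TCP would.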
The UE4 game has a socket waiting on a dedicated thread for network data (see socket programming). When data is received, each 16-bit value is mapped back to 0–360 and stored in a rotator variable; the game thread reads this rotator each tick and applies it to the bar-shaped mesh in the game world, so the phone's rotation is reflected in-game.
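The real receive side lives in UE4 C++ (typically an FRunnable driving an FSocket), but the same pattern can be sketched in Java with assumed names: a daemon thread blocks on receive(), decodes the three 16-bit values back to degrees, and publishes them atomically so the game tick can read the latest rotation without locking.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.nio.ByteBuffer;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch of the receiver-thread pattern described above.
public class OrientationReceiver {
    // Latest yaw/pitch/roll packed into one long so the game thread can
    // read all three atomically without blocking the network thread.
    private final AtomicLong latest = new AtomicLong();

    // Map a 16-bit value [0, 65535] back onto degrees [0, 360].
    static double toDegrees(int u16) {
        return (u16 & 0xFFFF) / 65535.0 * 360.0;
    }

    void decode(byte[] data) {
        ByteBuffer buf = ByteBuffer.wrap(data);   // big-endian, matching the sender
        int yaw   = buf.getShort() & 0xFFFF;
        int pitch = buf.getShort() & 0xFFFF;
        int roll  = buf.getShort() & 0xFFFF;
        latest.set((long) yaw << 32 | (long) pitch << 16 | roll);
    }

    // Called from the game thread each tick (the UE4 version would build
    // an FRotator here instead).
    double[] latestRotation() {
        long v = latest.get();
        return new double[] {
            toDegrees((int) (v >>> 32)),
            toDegrees((int) (v >>> 16) & 0xFFFF),
            toDegrees((int) v & 0xFFFF)
        };
    }

    // Dedicated network thread: block on receive, decode, publish.
    void startListening(int port) {
        Thread t = new Thread(() -> {
            try (DatagramSocket socket = new DatagramSocket(port)) {
                byte[] buf = new byte[6];
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                while (!Thread.currentThread().isInterrupted()) {
                    socket.receive(packet);   // blocks; fine off the game thread
                    decode(packet.getData());
                }
            } catch (Exception e) { /* socket closed or interrupted */ }
        }, "udp-listener");
        t.setDaemon(true);
        t.start();
    }

    public static void main(String[] args) {
        OrientationReceiver r = new OrientationReceiver();
        r.startListening(7777);   // port is a placeholder
        // Simulate one incoming 6-byte packet (yaw~90, pitch~45, roll~180):
        r.decode(new byte[] {0x40, 0x00, 0x20, 0x00, (byte) 0x80, 0x00});
        double[] rot = r.latestRotation();
        System.out.printf("yaw=%.1f pitch=%.1f roll=%.1f%n", rot[0], rot[1], rot[2]);
    }
}
```

Keeping the blocking receive off the game thread is the point of the dedicated thread: a tick must never stall waiting on the network, and with UDP a dropped packet simply means the mesh holds its last rotation for one frame.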