Hello everybody,
it is my first time writing on the forum, but I have explored a lot of the functionality that FlightGear has to offer.
Recently I purchased a VR headset (an HTC Vive) and started studying how to develop simple apps for it using both the OpenSceneGraph and OpenGL frameworks, and from reading the forum I understood that a virtual reality approach was attempted some time ago with the Oculus Rift, but nothing official has followed since then.
I then took a dive into the source code and found that much of the rendering side of the software is based on OpenSceneGraph, so I had the idea of modifying the source code to add virtual reality support through the OpenVR API.
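To give an idea of what I mean, here is a rough sketch of the OpenVR side of the initialization, just an illustration of the API as I understand it, not FlightGear code:

#include <openvr.h>
#include <cstdint>
#include <cstdio>

int main()
{
    // Initialize the OpenVR runtime as a scene (rendering) application.
    vr::EVRInitError initError = vr::VRInitError_None;
    vr::IVRSystem *hmd = vr::VR_Init(&initError, vr::VRApplication_Scene);
    if (initError != vr::VRInitError_None) {
        std::printf("VR_Init failed: %s\n",
                    vr::VR_GetVRInitErrorAsEnglishDescription(initError));
        return 1;
    }

    // Ask the runtime for the per-eye render target size the HMD expects.
    uint32_t width = 0, height = 0;
    hmd->GetRecommendedRenderTargetSize(&width, &height);
    std::printf("Per-eye render target: %ux%u\n", width, height);

    vr::VR_Shutdown();
    return 0;
}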
I am aware that it is not currently possible to have multiple cameras active in the scene at the same time due to the lack of a composite viewer, but this limitation could perhaps be worked around with tools like SteamVR that allow mirroring of the screen.
The first step I want to take is to render the camera view directly inside the headset, but I don't clearly understand where in the source code the final framebuffer object to be displayed is created. Can anyone point me in the right direction?
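In case it clarifies what I am after, this is roughly what the hand-off to the headset looks like on the OpenVR side once the final per-eye textures exist (leftEyeTex and rightEyeTex are placeholders for whatever FlightGear's FBO color attachments turn out to be):

#include <openvr.h>
#include <GL/gl.h>
#include <cstdint>

// Placeholder GL texture ids: the color attachments of the per-eye
// FBOs that the scene has just been rendered into.
extern GLuint leftEyeTex, rightEyeTex;

void submitFrame()
{
    // WaitGetPoses must be called once per frame; it also blocks until
    // the compositor is ready to accept a new frame.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    vr::VRCompositor()->WaitGetPoses(poses, vr::k_unMaxTrackedDeviceCount,
                                     nullptr, 0);

    // Hand the finished color textures to the compositor, one per eye.
    vr::Texture_t left  = { (void*)(uintptr_t)leftEyeTex,
                            vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
    vr::Texture_t right = { (void*)(uintptr_t)rightEyeTex,
                            vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
    vr::VRCompositor()->Submit(vr::Eye_Left, &left);
    vr::VRCompositor()->Submit(vr::Eye_Right, &right);
}

So what I am really looking for is the point in FlightGear's renderer where those final color textures (or the equivalent FBO) come from.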
Thank you in advance for your attention.
Edit: I should specify that I am interested in developing this on Linux.