Probably the best way is NOT to integrate the Oculus Rift in FlightGear itself but in OpenSceneGraph, as the other 3D modes (such as red/green glasses or stereographic mode) already do. In fact, there is already an OpenSceneGraph plugin for the Oculus:
https://github.com/bjornblissing/osgoculusviewer
The only thing you would have to change in FlightGear is activating a flag when the user requests the Oculus mode, in exactly the same way the other 3D modes work.
Another way, probably a bit more complex, is defining two windows as in a multiscreen configuration (
http://wiki.flightgear.org/Howto:Config ... ew_windows) , each window with a camera slightly offset from the other, and then applying a shader to each window to compute the barrel distortion:
https://github.com/dghost/glslRiftDistort
You can test this proposal right now by activating the "stereographic 3D" mode in the FlightGear menu. Since the current "stereographic 3D" mode does not apply the barrel transform the Oculus lenses require, the view near the edges of the screen will be heavily distorted, but you'll get the idea of how it works.
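To give an idea of what that per-window shader would compute, here is the barrel distortion done on the CPU as a sketch. The polynomial form follows the convention used by the early Oculus SDK; the example coefficients are assumptions and the real values depend on the headset, so treat this as illustrative only.

```python
def barrel_distort(x, y, k=(1.0, 0.22, 0.24, 0.0)):
    """Map an undistorted, lens-centered coordinate (x, y) to the
    distorted coordinate the lens expects. k holds the polynomial
    coefficients k0..k3 (example values, not headset-calibrated)."""
    r2 = x * x + y * y  # squared distance from the lens center
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2 + k[3] * r2 * r2 * r2
    return x * scale, y * scale

# The center of the view is left unchanged, while points farther from
# the center are pushed outward; the lens then bends them back,
# cancelling its pincushion effect.
print(barrel_distort(0.0, 0.0))
print(barrel_distort(0.5, 0.0))
```

In the real setup this computation runs per-pixel in a GLSL fragment shader (as in the glslRiftDistort project linked above), not on the CPU.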
Regarding the tracking, I successfully integrated it in exactly the same way the other tracking methods work: by doing it in a separate application that connects to FlightGear via telnet:
http://wiki.flightgear.org/Head_tracking
Keep in mind this is just my opinion, based on how I tried (but failed) to integrate the Oculus before they dropped support for Linux.
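For reference, the telnet approach can be sketched roughly as below. This assumes FlightGear was started with its props/telnet server enabled (e.g. --telnet=5401) and uses the standard view-offset properties; verify the property names against your FlightGear version before relying on them.

```python
import socket

def set_command(prop, value):
    """Build one 'set' line for the FlightGear props (telnet) server."""
    return "set {} {}\r\n".format(prop, value)

def send_head_pose(sock, heading_deg, pitch_deg, roll_deg):
    """Push head-tracker angles into the current view via an open
    telnet connection to FlightGear."""
    for prop, val in (
        ("/sim/current-view/heading-offset-deg", heading_deg),
        ("/sim/current-view/pitch-offset-deg", pitch_deg),
        ("/sim/current-view/roll-offset-deg", roll_deg),
    ):
        sock.sendall(set_command(prop, val).encode("ascii"))

# Real usage (needs a running FlightGear with --telnet=5401):
#   with socket.create_connection(("localhost", 5401)) as s:
#       send_head_pose(s, 10.0, -5.0, 0.0)
print(set_command("/sim/current-view/heading-offset-deg", 10.0).strip())
```

The separate tracking application would read the Rift's sensors in a loop and call something like send_head_pose each frame, exactly as the other external head-tracking tools on the wiki page do.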