
Standalone EFIS, Autopilot, and Synoptic Framework


Standalone EFIS, Autopilot, and Synoptic Framework

Postby MechAero » Wed Jun 02, 2021 3:28 pm

I am new to FlightGear and its rich history of glass-cockpit efforts and related development.

What would be the cleanest way to provide a reconfigurable, 2-way glass cockpit for another (custom) simulator, perhaps leveraging some mostly pre-built solution or well-developed starting point?
The project requirements are:
  1. Display data (tapes, compass, artificial horizon attitude indicator, digital readouts, electronic gauges, indicator lights, as on a PFD, MFD, ND, autopilot settings, etc.) in real time (i.e., 30 Hz, less than 0.1 second latency, let's say) using a specified protocol on multiple screens
  2. Capture user interaction with buttons, knobs, multi-position switches, toggles, maps, etc. on touch screens
  3. Allow redesign and rapid prototyping of different cockpit layouts with different numbers, types, styles, and positions of instruments/widgets
Platform-independence would be valuable too, possibly running on microcomputers and mobile devices.

Below is a reference list of relevant frameworks, tools, and capabilities in FlightGear, some of which are outdated or under development as of this post. The difficulty is in selecting an approach and tools for the project at hand.
From searching this forum and the FlightGear wiki, the following are pertinent:
Other resources:

These are listed partly just for anyone else finding this later and looking for leads.

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby Hooray » Wed Jun 02, 2021 4:34 pm

Emesary in conjunction with the MFD framework would provide a solid foundation for such an effort, and because of the underlying design, it should be fairly straightforward to also implement the interfacing portion.

You would basically set up an MFD instance with events/actions implemented via Emesary and then use something like rleibner's or jsb's drawing API:

https://wiki.flightgear.org/Canvas_Draw
https://wiki.flightgear.org/How_to_mani ... s_elements
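
For illustration, the Emesary side could look roughly like this (untested sketch; the notification type "EFISButtonNotification" and the ident strings are made up, they are not part of any existing aircraft):

    # a recipient that reacts to abstract button/knob events
    var efis_recipient = emesary.Recipient.new("EFISControl");
    efis_recipient.Receive = func(notification) {
        if (notification.NotificationType == "EFISButtonNotification") {
            # e.g. switch the active MFD page or change a mode here
            print("EFIS event: ", notification.Ident);
            return emesary.Transmitter.ReceiptStatus_OK;
        }
        return emesary.Transmitter.ReceiptStatus_NotProcessed;
    };
    emesary.GlobalTransmitter.Register(efis_recipient);

    # a cockpit hotspot, GUI dialog or remote session would then publish the same abstract event:
    emesary.GlobalTransmitter.NotifyAll(emesary.Notification.new("EFISButtonNotification", "range-up"));

The point being that whatever sends the notification (hot spot, dialog, telnet/Phi client) doesn't need to know anything about the MFD that consumes it.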


The MapStructure stuff is primarily useful for mapping/charting purposes, i.e. stuff like NDs or moving map displays (see the FG1000 for the reference implementation of an MFD using Emesary and MapStructure).

For rapid prototyping purposes it would make sense to base the whole thing on SVG files that are then animated via properties and Nasal callbacks - that can be accomplished by extending svg.nas to support "nasal" and FlightGear properties (again, this is possible in Nasal space).

You could also add custom SVG primitives that way, e.g. for stuff like a compass rose, ADI, ADF, HSI etc - which would then leverage the corresponding Nasal routines to draw such elements procedurally (the shuttle has code to procedurally draw/update and animate avionics as needed).
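
To give you an idea what the procedural route looks like, here's a rough, untested sketch that draws compass-rose tick marks into an existing Canvas group (the "root" group and the 200 px radius are placeholders):

    var rose = root.createChild("path", "compass-rose")
                   .setStrokeLineWidth(2)
                   .setColor(1, 1, 1);
    for (var deg = 0; deg < 360; deg += 10) {
        var a = deg * D2R;
        var len = (math.mod(deg, 30) == 0) ? 20 : 10;   # longer tick every 30 degrees
        rose.moveTo(200 * math.sin(a), -200 * math.cos(a))
            .lineTo((200 - len) * math.sin(a), -(200 - len) * math.cos(a));
    }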

For your interfacing and I/O needs, you might be interested in reading up on Erik's latest DDS work: https://wiki.flightgear.org/Data_Distri ... es_support

Basically, you could set up "DDS channels" that would then transmit certain categories of data (say /position, /orientation, /fdm, /autopilot, /route-manager etc).

PS: FGCanvas is not actually a thing these days - FGQCanvas is, but it comes with stability and compatibility issues. For FGCanvas, the underlying idea was/is to start only a subset of fgfs, e.g. just the Canvas and related subsystems (properties, events, Nasal, XML), and then run this subset of FlightGear to display data provided by another fgfs instance. We had this working once, basically by patching fg_init.cxx to make unnecessary stuff optional - but these days it seems more common to use Phi (DHTML based) or FGQCanvas. If in doubt, check the developers' mailing list and/or reach out to Stuart (a core developer and the developer of the FG1000).

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby MechAero » Wed Jun 02, 2021 5:52 pm

Thank you, Hooray. Curious: why do you recommend the MFD framework over the EFIS framework?
Would such a system be displayed using Phi, then?
Are you describing two separate drawing approaches (drawing API vs. SVG files) before and after your comment on MapStructure, or did you mean to use both?

Re: Standalone EFIS, Autopilot, and Synoptic Framework  

Postby Hooray » Wed Jun 02, 2021 8:09 pm

SVG files are XML files that are commonly edited using existing standard tools such as Inkscape.
Under the hood, there's a Nasal module in $FG_ROOT/Nasal/canvas/svg.nas that parses such XML files and maps a SUBSET of SVG/XML to the OpenVG Path syntax understood by the Canvas.Path drawing primitive (element) that is then mapped to OSG/ShivaVG drawing instructions.

For ultimate flexibility, you can obviously also use Canvas.Path based drawing directly - most people without a coding background will typically resort to WYSIWYG editors, however, whereas coders tend to use the APIs directly without going through the SVG/XML route.

If, on the other hand, you want to be able to easily tinker with different artwork/schemes, the SVG/XML option is pretty straightforward: you could tinker with different panels/layouts and instruments fairly easily, and even get artwork created/contributed by non-coders. The coding option, in contrast, is typically more accessible to programmers, many of whom don't exactly enjoy using tools like Inkscape etc.
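
A minimal, untested sketch of that workflow (the file path, element id and placement name are placeholders for your own artwork):

    var pfd_canvas = canvas.new({"name": "PFD", "size": [1024, 1024], "view": [1024, 1024], "mipmapping": 1});
    pfd_canvas.addPlacement({"node": "PFD-Screen"});   # name of a 3D object in your cockpit model
    var root = pfd_canvas.createGroup();
    canvas.parsesvg(root, "Aircraft/MyAircraft/Models/pfd.svg");   # artwork made in Inkscape

    # animate one element of the SVG via a property
    var horizon = root.getElementById("adi-horizon");
    setlistener("/orientation/pitch-deg", func(node) {
        horizon.setTranslation(0, node.getValue() * 10);   # 10 px per degree, arbitrary scale
    }, 1, 0);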

Using the SVG/XML option also means that you may run into unsupported SVG/XML markup that isn't currently understood by our tiny home-grown parser - then again, someone with a background in coding can fairly easily extend it:

https://wiki.flightgear.org/Canvas_SVG_parser
https://wiki.flightgear.org/Howto:Exten ... SVG_module

In addition, it's possible to use a hybrid approach - i.e. Inkscape (SVG/XML) for the more "static" elements, and procedural drawing for animations and anything that needs to be updated frequently (non-static).


Regarding the MFD framework in particular, it's well-designed and based on concepts we've been discussing for years, for instance using a simple design where MFD instruments consist of "pages" with "page elements", including support for page navigation.
While that may sound like overkill for simple stuff like an EHSI or ND, this is one of the most common use-cases among complex avionics, and something where many previous attempts/approaches simply failed.

For instance, most of the early MFD instruments based on Nasal/Canvas could not easily be used with different aircraft, different cockpits/panels or different FDMs - instead they contained a ton of hard-coded assumptions/structures.
The MFD framework can help you come up with a generic design where you can easily replace components without having to rewrite tons of stuff - especially because this framework and Emesary were written by the same developer (Richard Harrison, another core developer).

However, just to be clear about it, this isn't saying anything about the EFIS framework at all - since that is primarily about "drawing", i.e. populating stuff - whereas Richard's MFD framework is more about organizing a device into pages, page elements, knobs/buttons and mapping those to different events.

Thus, for starters you could for example try to prototype an HSI by creating a single-page MFD with different modes, using Emesary for bindings.
You would then populate/animate the whole thing with drawing provided by jsb's framework.

So there's nothing mutually exclusive here - it's just a little under-documented, unfortunately.
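
Conceptually (and this is just an illustrative sketch in plain Nasal, not the actual classes from Richard's framework or the FG1000), a page is little more than a container of page elements that each know how to bind to an SVG element and update themselves:

    var PageElement = {
        new: func(svg_id, getter) {
            return {parents: [PageElement], svg_id: svg_id, getter: getter, element: nil};
        },
        bind: func(root) { me.element = root.getElementById(me.svg_id); },        # svg_id must exist in the artwork
        update: func() { me.element.setText(sprintf("%5.0f", me.getter())); },    # assumes a text element
    };

    var Page = {
        new: func(name) { return {parents: [Page], name: name, elements: []}; },
        add: func(e) { append(me.elements, e); },
        update: func() { foreach(var e; me.elements) e.update(); },
    };

The real framework adds page navigation, device/knob abstractions and Emesary-based event routing on top of that idea.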

On the other hand, if you don't know much about OOP, JavaScript/Nasal and MVC or other design patterns, your life will be much easier ignoring the MFD/Emesary stuff, because that basically requires a strong background in CS/SE or someone with a corresponding background to mentor you - in other words, the SVG/XML + Inkscape option would provide more bang for the buck and not much of a learning curve in comparison.


MapStructure is a different/orthogonal piece altogether - it's basically an MVC framework to create maps/charts by using sets of files that control different aspects of such moving map displays, for instance:

- maps consist of layers
- layers consist of background layers (compass rose) and content layers (symbols like navaids)
- layers can have layer controllers
- symbols can have associated symbol controllers

Under the hood you basically set up an empty map and you tell it what kind of layers you want to be shown (say VOR, NDB and TRAFFIC).

The framework will then work very much like "ruby on rails" by making some assumptions for you (which can be customized/overridden) and then sets up a new map.

That map will then use a configured position source (e.g. /position or MP/AI traffic) as the virtual center of the map.
The range of the map can then be provided by either a property or a delegate.

And then there's a bunch of styling options to customize different maps/layers as needed.
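
For a rough idea, setting up such a map in Nasal looks roughly like the snippets on the MapStructure wiki page (untested here; layer names like "TFC" and the exact signatures may differ between versions - check $FG_ROOT/Nasal/canvas/map for the current set):

    var MyMap = root.createChild("map");
    MyMap.setController("Aircraft position");     # use the main aircraft as the position source
    MyMap.setRange(25);                           # nm; could also be driven by a property or delegate
    MyMap.setTranslation(512, 512);               # centre of a 1024x1024 canvas
    foreach(var type; ["VOR", "NDB", "TFC"])
        MyMap.addLayer(factory: canvas.SymbolLayer, type_arg: type, visible: 1);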

This may sound a little complicated/roundabout, but this approach was chosen to ensure that the same code base and back end can be reused by different aircraft and other frontends (GUI dialogs).

Because once you think about it, there are shared/overlapping requirements when it comes to showing an in-sim map with different navaids/traffic/weather etc and showing a corresponding moving map instrument.
Typically, the cockpit instrument is controlled by virtual buttons/knobs and cockpit bindings, whereas the UI dialog is controlled by mouse and keyboard events.

And that's exactly where Emesary shines: instead of dealing with clicks/buttons directly, they're basically mapped to a message channel and turned into abstract events - that way, the same events can be triggered/processed by different front-ends, no matter if that means a cockpit with hot spots/bindings, a GUI dialog or even some telnet/browser client triggering events/actions in the sim using a corresponding telnet/phi session.

However, this definitely is a different kind of coding (take a look at the FG1000 to see for yourself), so you will need a corresponding background - in fact, I believe that so far the MFD/Emesary approach has only been adopted by people who were either mentored by Richard himself or who have a professional background in computer science/engineering and/or FlightGear core development specifically.

Finally, the end result of such an instrument is always a texture that is updated in memory (a so-called RTT/FBO) - displaying that elsewhere means literally STREAMING the texture data to another process/machine (which is possible and supported already, but comes with latency issues depending on your connectivity/setup).

There are other ideas/approaches, but the only real option outside the fgfs code base is FGQCanvas, which comes with its own pros & cons - for instance, it cannot currently render/support most MapStructure or FG1000 stuff and has other restrictions.

One thing that some of us contemplated a few years ago was either serializing the Canvas itself to SVG/XML format so that it could be directly interpreted by any web browser, or alternatively using a tiny common subset of Nasal and JavaScript (which are similar enough) that could be interpreted by both fgfs and a browser.

You could also structure your SVG files to use a subset of SVG/XML that is understood by FlightGear and then load JavaScript over http (Phi) - that way the same artwork could be used, with only the animation logic having to be handled by Nasal/JavaScript respectively (at which point you will inevitably realize that it makes sense to use a common subset of both languages).

The latter is an idea that the creator/designer of the Canvas system was contemplating for a while ...

Ultimately it all depends on your requirements, your background and time frame.

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby MechAero » Wed Jun 02, 2021 9:17 pm

That answered my question about the MFD framework vs. the EFIS framework. Your rich responses like this could probably be edited to augment/replace much of the wiki.
I do admire the MapStructure idea to unify the backend for the in-game and GUI maps, but I need to study the MVC pattern a bit more. I have used the factory method and recursion, as indicators of my (somewhat limited) CS expertise. I am willing to try a "clean" approach, even if it's less "bang for the buck" initially. The goal here is a long-term, maintainable, extensible solution, not a quick-and-dirty one (which is the current approach) using tediously hand-drawn, animated images.

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby MechAero » Wed Jun 02, 2021 10:05 pm

Anyone else have any input?

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby Hooray » Thu Jun 03, 2021 6:05 pm

MechAero wrote in Wed Jun 02, 2021 9:17 pm:That answered my question about the MFD framework vs. the EFIS framework. Your rich responses like this could probably be edited to augment/replace much of the wiki.


I was going to suggest that you start documenting your project using our wiki - a number of people have documented their journeys that way, for example see:

https://wiki.flightgear.org/Raspberry_OS_setup
https://wiki.flightgear.org/Howto:Using ... erator_(IG)

Obviously, feel free to add/edit/rewrite content you deem useful to your article, including my postings. :wink:

I do admire the MapStructure idea to unify the backend for the in-game and GUI maps, but I need to study the MVC pattern a bit more. I have used the factory method and recursion, as indicators of my (somewhat limited) CS expertise. I am willing to try a "clean" approach, even if it's less "bang for the buck" initially. The goal here is a long-term, maintainable, extensible solution, not a quick-and-dirty one (which is the current approach) using tediously hand-drawn, animated images.


MVC isn't rocket science - it stands for Model/View/Controller.
Basically, we're using a decoupled approach to encapsulate 3 different things:

Model = WHAT IS TO BE RENDERED/SHOWN (e.g. navaids, the position of a navaid and the type of navaid/symbol)
View = HOW IT IS TO BE RENDERED (actual drawing routines or images/file names, transformations/animations)
Controller = HOW TO CONTROL THE MAP/LAYER (mouse, keyboard, cockpit bindings)

This is why this type of coding may seem a bit redundant or roundabout to people unfamiliar with such abstraction patterns - anything that is supposed to be customizable is typically passed via a hash with default key/value pairs, which can then be overridden by any calling code - for instance to add custom callbacks and/or classes.
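
In plain Nasal that idiom is nothing fancy - roughly (names are illustrative only):

    var make_layer = func(options = nil) {
        var settings = {color: [1, 1, 1, 1], range_nm: 25, update_hz: 2};   # defaults
        if (options != nil)
            foreach(var key; keys(options))
                settings[key] = options[key];                               # caller overrides
        return settings;   # a real layer/symbol would go on to use these settings
    };

    var layer = make_layer({range_nm: 40});   # keeps the default color and update rate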

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby Johan G » Thu Jun 03, 2021 7:05 pm

Hooray wrote in Thu Jun 03, 2021 6:05 pm:MVC isn't rocket science - it stands for Model/View/Controller.
Basically, we're using a decoupled approach to encapsulate 3 different things:

Model = WHAT IS TO BE RENDERED/SHOWN (e.g. navaids, the position of a navaid and the type of navaid/symbol)
View = HOW IT IS TO BE RENDERED (actual drawing routines or images/file names, transformations/animations)
Controller = HOW TO CONTROL THE MAP/LAYER (mouse, keyboard, cockpit bindings)

This encapsulation will help both you and others later. Sometimes in ways that could not have been predicted.

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby Hooray » Thu Jun 03, 2021 7:41 pm

It is however worth noting that people without a coding background will rarely appreciate/recognize the additional complexity.
Besides, we're using hashes, functions (delegates) and classes/objects for the MVC parts of MapStructure.
However, Emesary would provide a much better/stronger separation here by using "channels", "transmitters" and "receivers" that deal with different message types and messages.

Basically, receivers can subscribe to a channel/event and then process those events.

That would be another option to implement an MVC-oriented scheme where you can isolate different components, without necessarily following MVC in the conventional sense.

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby stuart » Fri Jun 04, 2021 8:11 pm

Hi MechAero,

I agree with Hooray that using Emesary with the MFD framework would likely be the best way to achieve your goals. When developing the FG1000, this in combination with the MVC model really helped structure what I was coding.

-Stuart

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby MechAero » Fri Jun 04, 2021 10:47 pm

Thank you, Johan G, Hooray, and stuart,

There seems to be a consensus. That is encouraging.

I am investigating the possibility of using the J661 project apart from FlightGear, after input from would-be users that a graphical GUI-building environment would be preferred to a Nasal scripting one, but I really appreciate the responses and would welcome any more.

Sincerely,
MechAero
 

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby Hooray » Sat Jun 05, 2021 10:08 pm

J661 output files are not currently supported natively by FlightGear - that would require a dedicated parser to read the A661/DF markup and map it to the corresponding Canvas primitives. Non-professional users are likely to find a conventional SVG editor much more appealing in comparison.
On the other hand, it would also be possible to set up such an A661 parser by mapping pre-created SVG files/artwork to different A661 elements/widgets.

Our de-facto reference implementation of an SVG/XML-to-Canvas parser is the svg.nas module in $FG_ROOT/Nasal/canvas.
You could open it, take a look, and determine whether you find it sufficiently accessible to use as the foundation for a new parser that maps XML tags to SVG/Canvas instructions.

Basically, without a CS background and if you have never parsed/processed XML previously, that option is probably not too appealing either.

It is however possible to code up a GUI editor in Canvas space, too - basically, you'd treat a specific directory as the source for your widget templates (defined in SVG space) and then load one file per widget - and at that point, you would treat configurable parameters (think width/height, position etc) as dynamic by supporting mouse events.
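
A rough, untested sketch of that idea (the directory name, naming convention and the "root" group are placeholders):

    var template_dir = getprop("/sim/fg-root") ~ "/Nasal/canvas/efis-widgets/";
    foreach(var file; directory(template_dir)) {
        if (size(file) < 4 or substr(file, size(file) - 4) != ".svg")
            continue;                                   # only pick up SVG widget templates
        var widget = root.createChild("group", file);
        canvas.parsesvg(widget, template_dir ~ file);
        widget.setTranslation(100, 100);                # position/size would then be driven by mouse events
    }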

It should be much less work to come up with such a scheme in Nasal/Canvas space than to write a full-fledged A661 parser (not meaning to discourage you, but A661 files are not currently supported by fgfs directly, even though that would certainly be a good project and would make FlightGear increasingly relevant to professional users/projects).

PS: Depending on your use-case and requirements, it might be worth exploring how to hook up Emesary to some socket-based IPC - that way you could replicate MFDs across multiple fgfs instances without having to stream the actual texture (rather, the corresponding low-level primitives would be "streamed" and updated on demand).

Re: Standalone EFIS, Autopilot, and Synoptic Framework

Postby MechAero » Thu Jun 24, 2021 9:51 pm

Thank you for those suggestions, Hooray,

I probably should have been clearer in my previous post that I have decided not to use FlightGear, at least for now.

The ease of reconfigurability/design with the ARINC 661 definition files, coupled with J661's prebuilt WYSIWYG editor program (with built-in scriptability), makes J661 more attractive than integrating existing FlightGear frameworks/components or possibly stripping down FlightGear to just run instruments, as discussed in this thread: https://forum.flightgear.org/viewtopic.php?f=24&t=39333.

Things might be different if we just needed panels for a FlightGear flight dynamics model, but since the simulator itself is unrelated to FlightGear, it seemed unnecessary to try to build a reconfigurable panel solution in the FlightGear environment.

However, there are likely ideas in the FlightGear codebase, such as the Emesary framework, worth studying. Just the other day I took inspiration from canvas.draw (https://wiki.flightgear.org/Canvas_draw_library) to implement an equivalent of marksLinear() in J661 editor scripting while working on a PFD.

Thanks again for all your help.

Sincerely,

Samuel.

