SVG files are XML files that can be edited with existing standard tools such as Inkscape.
Under the hood, there's a Nasal module in $FG_ROOT/Nasal/canvas/svg.nas that parses such XML files and maps a SUBSET of SVG/XML to the OpenVG path syntax understood by the Canvas.Path drawing primitive (element), which is then mapped to OSG/ShivaVG drawing instructions.
For ultimate flexibility, you can of course also use Canvas.Path based drawing directly. Most people without a coding background will typically resort to WYSIWYG editors, however, whereas coders tend to use the APIs directly without going through the SVG/XML route.
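To make the two routes concrete, here's a minimal sketch based on the usual Canvas wiki examples - the canvas name, size, file path, coordinates and colors are all placeholders:

```nasal
# Create a standalone Canvas texture (sketch; size/name are example values)
var my_canvas = canvas.new({
  "name": "Demo",
  "size": [1024, 1024],
  "view": [1024, 1024],
  "mipmapping": 1
});
my_canvas.setColorBackground(0, 0, 0, 1);
var root = my_canvas.createGroup();

# Route 1: let svg.nas parse an SVG file into Canvas elements
canvas.parsesvg(root, "Path/to/your/artwork.svg");

# Route 2: draw directly via the Canvas.Path API (OpenVG-style commands)
root.createChild("path")
    .moveTo(100, 100)
    .lineTo(200, 200)
    .setColor(1, 0, 0)
    .setStrokeLineWidth(3);
```

Either way you end up with the same kind of Canvas elements, so the two routes can be freely mixed in one canvas.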
If, on the other hand, you want to easily tinker with different artwork/schemes, the SVG/XML option is pretty straightforward: you could experiment with different panels/layouts and instruments fairly easily, and even get artwork created/contributed by non-coders. The coding option, by contrast, is typically more accessible to programmers, many of whom don't exactly enjoy using tools like Inkscape etc.
Using the SVG/XML option also means that you may run into unsupported SVG/XML markup that isn't currently understood by our tiny home-grown parser - then again, someone with a background in coding can fairly easily extend it:
https://wiki.flightgear.org/Canvas_SVG_parser
https://wiki.flightgear.org/Howto:Exten ... SVG_module
In addition, it's possible to use a hybrid approach - i.e. Inkscape (SVG/XML) for the more "static" elements, and procedural drawing for animations and other stuff that needs to be updated frequently (non-static).
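A hybrid setup could look roughly like this - the SVG path and the "needle" element id are made-up examples; in practice you'd assign the id via Inkscape's XML editor:

```nasal
# Hybrid sketch: static artwork from Inkscape, animation done procedurally.
# Assumes the SVG contains an element with id="needle" (a made-up example id).
var my_canvas = canvas.new({
  "name": "HSI", "size": [512, 512], "view": [512, 512], "mipmapping": 1
});
var root = my_canvas.createGroup();
canvas.parsesvg(root, "Aircraft/MyPlane/gui/hsi.svg");

var needle = root.getElementById("needle");
needle.updateCenter();  # rotate around the element's own center

var update = func {
    var hdg = getprop("/orientation/heading-deg") or 0;
    needle.setRotation(-hdg * D2R);  # setRotation() expects radians
    settimer(update, 0.05);
};
update();
```

The static compass rose, bezels etc. stay in the SVG file where artists can edit them, while only the frequently-updated bits are touched from Nasal.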
Regarding the MFD framework in particular, it's well-designed and based on concepts we've been discussing for years, for instance using a simple design where MFD instruments consist of "pages" with "page elements", including support for page navigation.
While that may sound like overkill for simple stuff like an EHSI or ND, this is one of the most common use-cases among complex avionics, and something where many previous attempts/approaches simply failed.
For instance, most of the early MFD instruments based on Nasal/Canvas could not easily be used with different aircraft, different cockpits/panels or different FDMs - instead they contained a ton of hard-coded assumptions/structures.
The MFD framework can help you come up with a generic design where you can easily replace components without having to rewrite tons of stuff - especially because this framework and Emesary were written by the same developer (Richard Harrison, another core developer).
However, just to be clear about it, this isn't saying anything about the EFIS framework at all - since that is primarily about "drawing", i.e. populating stuff - whereas Richard's MFD framework is more about organizing a device into pages, page elements, knobs/buttons and mapping those to different events.
Thus, for starters you could for example try to prototype an HSI by creating a single-page MFD with different modes, using Emesary for bindings.
And then you could populate/animate the whole thing with the drawing provided by jsb's framework.
So there's nothing mutually exclusive here, just a little undocumented unfortunately.
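To illustrate the pages/page-elements idea without tying this to the actual MFD framework API (for the real thing, look at the FG1000), here's a deliberately simplified, hypothetical sketch in plain Nasal - none of these class names exist in the framework itself:

```nasal
# Hypothetical, simplified sketch of the page/page-element idea -
# NOT the actual MFD framework API (see the FG1000 for the real thing).
var Page = {
    new: func(name) {
        return { parents: [Page], name: name, elements: [] };
    },
    addElement: func(e) { append(me.elements, e); },
    update: func { foreach (var e; me.elements) e.update(); },
};

var Device = {
    new: func {
        return { parents: [Device], pages: {}, current: nil };
    },
    addPage: func(p) { me.pages[p.name] = p; },
    selectPage: func(name) { me.current = me.pages[name]; },
    update: func { if (me.current != nil) me.current.update(); },
};

# A single-page device with one page element - e.g. an HSI prototype:
var hsi = Page.new("HSI");
hsi.addElement({ update: func { print("updating HSI symbology"); } });

var mfd = Device.new();
mfd.addPage(hsi);
mfd.selectPage("HSI");
mfd.update();
```

The point is the separation: the device only knows about pages and page elements, so swapping out what a page element draws (e.g. jsb's EFIS drawing code) doesn't touch the page navigation logic.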
On the other hand, if you don't know much about OOP, JavaScript/Nasal and MVC or other design patterns, your life will be much easier ignoring the MFD/Emesary stuff, because that basically requires a strong backgound in CS/SE or someone with a corresponding background to mentor you - in other words, the SVG/XML + Inkscape option would provide more bang for the buck and not much of a learning curve in comparison.
MapStructure is a different/orthogonal piece altogether - it's basically an MVC framework to create maps/charts by using sets of files that control different aspects of such moving map displays, for instance:
- maps consist of layers
- layers consist of background layers (compass rose) and content layers (symbols like navaids)
- layers can have layer controllers
- symbols can have associated symbol controllers
Under the hood you basically set up an empty map and you tell it what kind of layers you want to be shown (say VOR, NDB and TRAFFIC).
The framework will then work very much like "ruby on rails" by making some assumptions for you (which can be customized/overridden) and then setting up a new map.
That map will then use a configured position source (e.g. /position or MP/AI traffic) as the virtual center of the map.
The range of the map can then be provided by either a property or a delegate.
And then there's a bunch of styling options to customize different maps/layers as needed.
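Put together, a minimal MapStructure setup could look roughly like this - loosely based on the wiki examples, so layer names and the exact addLayer() arguments may differ between FlightGear versions:

```nasal
# MapStructure sketch: an empty map that is told which layers to show.
var my_canvas = canvas.new({
  "name": "MapDemo", "size": [1024, 1024], "view": [1024, 1024], "mipmapping": 1
});
var root = my_canvas.createGroup();

var TestMap = root.createChild("map");
TestMap.setController("Aircraft position");   # use /position as the map center
TestMap.setRange(25);                         # range, e.g. in nm

# Tell the framework which layers to set up - it fills in sane defaults
# ("ruby on rails" style), which can be customized/overridden later:
foreach (var type; ["VOR", "NDB", "TFC"])
    TestMap.addLayer(factory: canvas.SymbolLayer,
                     type_arg: type,
                     visible: 1,
                     priority: 4);
```

Note how nothing here is aircraft-specific: the same snippet works for a cockpit instrument or a GUI dialog, which is exactly the reuse the framework is after.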
This may sound a little complicated/roundabout, but this approach was chosen to ensure that the same code base and back end can be reused by different aircraft and other frontends (GUI dialogs).
Because once you think about it, there are shared/overlapping requirements when it comes to showing an in-sim map with different navaids/traffic/weather etc and showing a corresponding moving map instrument.
Typically, the cockpit instrument is controlled by virtual buttons/knobs and cockpit bindings, whereas the UI dialog is controlled by mouse and keyboard events.
And that's exactly where Emesary shines: instead of dealing with clicks/buttons directly, they're basically mapped to a message channel and turned into abstract events - that way, the same events can be triggered/processed by different front-ends, no matter if that means a cockpit with hot spots/bindings, a GUI dialog or even some telnet/browser client triggering events/actions in the sim using a corresponding telnet/phi session.
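A minimal Emesary sketch might look like this - the notification type string and the EventName payload field are made up for illustration; only the Recipient/Transmitter API itself is the real Emesary interface:

```nasal
# Emesary sketch: a recipient reacting to abstract events instead of raw clicks.
var handler = emesary.Recipient.new("MyMFDHandler");
handler.Receive = func(notification) {
    if (notification.NotificationType == "MFDEvent") {
        print("MFD event received: ", notification.EventName);
        return emesary.Transmitter.ReceiptStatus_OK;
    }
    return emesary.Transmitter.ReceiptStatus_NotProcessed;
};
emesary.GlobalTransmitter.Register(handler);

# Any front-end (cockpit hotspot, GUI dialog, Phi/telnet client) can now
# raise the same abstract event without knowing who handles it:
var notification = emesary.Notification.new("MFDEvent", "demo");
notification.EventName = "range-knob-increment";
emesary.GlobalTransmitter.NotifyAll(notification);
```

The handler never sees a mouse click or a hot spot - just the abstract event - which is what decouples the instrument logic from any particular front-end.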
However, this sort of coding definitely is a different kind of coding (take a look at the FG1000 to see for yourself). So you will need to have a corresponding background in coding - and in fact I believe, so far the MFD/Emesary approach has only been adopted by people who were either mentored by Richard himself or by people with a professional background in computer science/engineering and/or in FlightGear core development specifically.
Finally, the end result of such an instrument is always a texture that is updated in memory (a so-called RTT/FBO) - displaying that elsewhere means literally STREAMING the texture data to another process/machine (which is possible and supported already, but comes with latency issues depending on your connectivity/setup).
There are other ideas/approaches, but the only other option outside the fgfs code base is definitely FGQCanvas, but that comes with its own pros & cons - for instance, it cannot currently render/support most MapStructure or FG1000 stuff and has other restrictions.
One thing that some of us contemplated a few years ago was either serializing the Canvas itself to SVG/XML format so that it could be directly interpreted by any web browser, or alternatively begin using a subset of Nasal + JavaScript (which are similar enough) so that a tiny subset of Nasal/JavaScript would be used that could be interpreted by both, fgfs and a browser.
You could also structure your SVG files to use a subset of SVG/XML that is understood by FlightGear and then load JavaScript over http (Phi) - that way, the same artwork could be used, with only the animation stuff having to be handled by Nasal/JavaScript respectively (at which point you will inevitably realize that it makes sense to use a common subset of both languages).
The latter is an idea that the creator/designer of the Canvas system was contemplating for a while ...
Ultimately it all depends on your requirements, your background and time frame.