What a mess here!
I (Vincent Rivière) just want to say that I welcome the GCC brown edition. It takes a different approach from the FreeMiNT one, and different visions of the same object lead to a better understanding.
- The FreeMiNT approach is the native one: port GCC to the new operating system, patch the linker to generate native executables so GCC can naturally output executables for the current system, then recompile everything with it: libraries, tools, etc. Of course the same can be achieved with a cross-compiler (my preferred method).
- The Brown approach is the embedded one: everything is cross-compiled to generic ELF, then a post-processor converts the resulting executable to the native format. There are no libraries or tools on the target machine, because we don't need any; we just want to run the resulting executable there (see the small example after this list).
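To make the difference concrete, here is a tiny C program that works with either route. The command lines in the comment are only an assumption on my side (the exact invocations depend on your setup), but the point is that the same plain C source goes through both pipelines unchanged:

```c
/* hello.c - plain C, no OS-specific calls, so the same source builds with:
 *   - the native/MiNT route:  m68k-atari-mint-gcc hello.c -o hello.tos
 *     (the patched linker directly emits a GEMDOS executable)
 *   - the Brown route: a generic m68k ELF gcc, then the brownout
 *     post-processor converts the ELF output into a .prg/.tos
 * Both command lines above are assumptions; check your toolchain's docs.
 */
#include <stdio.h>

int main(void)
{
    puts("Hello from GCC on Atari!");
    getchar();   /* keep the TOS console open until a key is pressed */
    return 0;
}
```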
For 15 years, GCC has only been used on Atari by FreeMiNT people. Even if FreeMiNT's GCC (including my own cross-tools) can easily produce TOS-only executables, I have the feeling that it has never been used by TOS people. Surprisingly, it seems that Brown GCC has gained immediate enthusiasm from TOS people. I can only welcome this situation, as it brings more people to GCC. And if this encourages people to write cleaner code, compatible with GCC and ELF features, that can only be a good thing.
Regarding my m68k-atari-mint cross-tools, yes, they are currently stuck at GCC 4.6. This is just because that version is good enough for my own needs, as well as for FreeMiNT people. It has been extremely hard, over the years, to get something stable. Every new version needs more work, more testing, more bug reports, more risk of regression... and in the end, not much benefit. Most of all, in the last years I have had less time to care about GCC stuff, so things stayed like that. But the most important thing is that I have published all my work, including the full history, on GitHub: m68k-atari-mint-gcc. So people braver than me can continue the work with newer GCC, which is exactly what is currently happening with Thorsten and MiKRO.
Regarding a.out, even if it is *not* dead, it is more and more deprecated. By far the biggest trouble I encountered when porting MiNT patches to newer binutils/GCC was upstream bugs related to a.out. That's just how it is: GNU people don't care about a.out anymore, so anyone still using a.out features will hit bugs, because that code is no longer tested upstream. So definitely, using ELF intermediate files is the way to go. This is why Brown GCC is one step ahead of FreeMiNT GCC on that issue.
On the other hand, I agree with the Brown detractors that the Brown solution is not what native GCC users would expect. But it *is* the general embedded solution: executables are generated with any standard ELF tools (old GCC, new GCC, or anything else), then the post-processor converts them into the target format. So any compiler upgrade is decoupled from the Atari-specific stuff. Hence more simplicity.
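For readers wondering what such a post-processor actually has to produce: the heart of it is the 28-byte GEMDOS executable header that precedes the text and data image in a .prg/.tos file. The sketch below only illustrates that output side (the field layout is the standard GEMDOS one, but a real converter like brownout also has to extract the segments from the ELF input and emit the relocation table):

```c
/* Simplified sketch: writing the GEMDOS (.prg) header that an ELF-to-TOS
 * post-processor must emit.  Segment sizes would be taken from the ELF
 * section/program headers of the cross-compiled input file. */
#include <stdint.h>
#include <stdio.h>

static void put16(FILE *f, uint16_t v)   /* TOS headers are big-endian */
{
    fputc((v >> 8) & 0xff, f);
    fputc(v & 0xff, f);
}

static void put32(FILE *f, uint32_t v)
{
    put16(f, (uint16_t)(v >> 16));
    put16(f, (uint16_t)(v & 0xffff));
}

static void write_prg_header(FILE *f, uint32_t text, uint32_t data, uint32_t bss)
{
    put16(f, 0x601A); /* magic: a 68k BRA.S that jumps over the header */
    put32(f, text);   /* text segment length */
    put32(f, data);   /* data segment length */
    put32(f, bss);    /* bss length (allocated at load time, not stored) */
    put32(f, 0);      /* symbol table length */
    put32(f, 0);      /* reserved */
    put32(f, 0);      /* program flags (fastload, TT-RAM, ...) */
    put16(f, 0);      /* 0 = a relocation table follows text+data */
}

int main(void)
{
    /* dummy example: header for 1 KiB of text, 256 bytes of data, 512 bytes of bss */
    write_prg_header(stdout, 1024, 256, 512);
    return 0;
}
```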
Regarding the legal stuff, I didn't look precisely at what is (or is not) provided (I didn't even try Brown GCC myself), so I have no idea whether there is a real issue or not. But basically, when you redistribute binaries, you must respect the upstream license, otherwise the upstream copyright holder (namely the FSF) may complain. When you redistribute binaries of GPL software, the following principles are roughly enough:
- do not distribute binaries built from original sources mixed with any GPL-incompatible sources
- provide your sources if a user of your binaries asks for them
The above rules are pretty easy to respect.
Of course you are free to use any license of your choice for your own tools (i.e. brownout), with your own rules.
So basically, I just say: continue Brown GCC in your own way; it's your project, an alternative to FreeMiNT GCC. If it were a bad thing, it wouldn't be used.