What went wrong with GPC?
-------------------------

Well, of course this is potentially very biased, because I'm in the FPC camp, but still I want to add some points to the discussion of what went wrong, and why it went wrong. Note that even other FPCers might (and actually do) disagree. The point is more the line of reasoning, and naming potential aspects to watch out for in FPC and other projects; it is not about giving some retrospective verdict on GPC. Note also that the original concept of this document dates from 2006-2007, though it was updated later.

Obviously, the low number of developers (and committed users) is the main problem of GPC, since as the older public project it has had more time (7-8 years, enough for an FPC major version cycle, or more) than FPC. (to research: when did GPC actually go public? I only see dates like 94-95 on the contributors page for people after Jukka) But a low number of users must have been the case with FPC in the early nineties too at some point. So one can rephrase the question as "why didn't GPC grow like FPC did?" Then answering it becomes a lot easier:

1. Standardised Pascals were the less popular dialect choice.

While this is not an absolute problem, the long reluctance to add a Borland (TP and Delphi) compatibility mode, and the failure to quickly make it decent (*), probably did more damage than the desire to (fully) support the standards. To my knowledge GPC still requires rewriting TP shortstring code to schemata to this day, though it claims "TP compatibility as much as reasonably possible for a 32/64-bit compiler" (**).

(*): Loosely defining decent as "without rewrites". While often quite simple to do, having to make modifications is a barrier to entry, and requires dual maintenance of the source if you need other compilers too. Few users have already burned all bridges when they start using a new compiler.
It is the users with the larger codebases that are the big bug reporters, and the most likely to enter development.

(**): In the first half of 2008 I verified with Waldek in c.l.p.m that this was still the case.

2. The choice for GCC.

While GCC has the basic advantage of providing a framework that you "only" need to modify for the language, there are some problems with it (the points that I feel are shared by GPC developers are marked with *):

- Your project effectively becomes multi-language (C and Pascal), needing double language skills for compiler/RTL developers, but also for your better users that want to provide some context with their bug reports.
* You need to track gcc version-wise, at least at a distance. Major restructurings in gcc potentially mean lots of work and instability, and most of the benefits they provide must be actively added to the port and validated as well. Major new gcc features are said to be free, but are more often than not accompanied by large fundamental restructurings.
- The build process becomes convoluted (lots of additional build tools, OS interaction via complex libraries), with a unix-centric build system and often only half-ported tools on non-unix (windows); older commercial unices are sometimes painful too.
- All this means that the number of independently versioned external tools and code increases, and their platform-specific behaviour and versioning complicates support.
* Using a non-OOP language (C) as the compiler language makes working with trees more difficult and error-prone. Yes, there is fiddling with macros, but ideal is different. (outdated: GCC is now being rewritten in C++)
- Few binary releases, and users can't build/install releases as easily, especially on non-*nix.
- The majority (Microsoft) platforms are relatively badly supported, and the compiler's usage and building principles are alien to them (e.g. is mingw COM compatible now?). Win64 support was awfully late.
- One tends to get caught up in workarounds that turn out to be problematic later (ISO procvars using trampolines instead of simply having a double pointer procvar type), because one-pointer procvars "fit" the framework and it is costly to work against it. (I assume there are some historic reasons behind this remark though)

Keep in mind that until the late nineties *nix was something for the happy few. In comparison, FPC used Turbo Pascal, the most popular compiler of that decade, with a productive (debuggable) environment. This facilitated a large project like a production compiler to a great extent. When TP went into decline, FPC was already mostly self-hosting (*) and adding Delphi extensions. Also one must not forget that combined 32-bit C and Pascal knowledge was a rarer combination of skills in the early nineties than Pascal knowledge alone, since this was before Linux brought such skills more mainstream. Most users by far had only DOS affinity. This was aggravated by GPC's pascal being the less popular dialect, and also GCC was not in the shape it is in today; especially on non-*nix it was useless to non-existent.

(*) quality debugging being the last missing piece. And partially still, but we share the GDB use with GPC :-)

(2b. The choice for FSF procedures. I put this in parentheses because I actually don't know for certain if GPC does this. But FSF procedures require paperwork to be signed before you can submit non-trivial patches.)

3. Transparency of the project.

Despite being a "free" project, a GNU one even, GPC lacks some infrastructure for users to easily participate:

- No user-accessible repository (GPC works with patches. If I understood right, each developer has their own set, and the sets are slowly mutually absorbed). Note that I mean read-only repository access here. It is not directly clear what is the "newest", or what is bleeding edge versus a (relatively stable) milestone.
- No (open) bug repository.
- No up-to-date docs, just an occasional build instruction on the mailing list (the site is chronically outdated), for a system that is known to be hard to build.
- _Very_ few user-ready releases; lots of building by 3rd parties which are their own islands, and a lack of a(ny) central release coordination.
- Looking at the site and the public resources, a user can't find the most recent sources. Not even one that spends some work on it. He must subscribe to the mailing list and ask, just to e.g. test.
- (afaik) Nothing but the most core RTL and compiler in the basic distribution. (see also point 4)
- Currently (end 2015) the site's main page carries a 2005 copyright message. Nothing indicates it was updated since that date. I think though that some of the content, like the todo list, was partially updated, at least until 2007. The last patch set (recommended on the mailing list anno 2012) is from 2007.
- Being a GNU project, it (probably?) needs a signature before any significant contribution. (2b)

Even despite being a gcc clone, this could have been eased a lot by some minor GPC-specific infrastructural work (e.g. a base repository with patch sets to auto-apply, and a few scripts to build the most current source and regularly put it online zipped).

4. Community.

Somehow a community around GPC never took off the way it did with FPC. All developers and the major users (usually release packagers) have some packages ported or created, but they are often outdated and hardly go beyond an initial release. There is no central place except an outdated page on the GPC site. If sites go down, packages might disappear. Luckily the core group is pretty persistent. Note that this is a bit reminiscent of the situation with the FPC contributed units page.

This can also be seen in the GPC team size: 2 or 3 developers, with another 2-3 package builders (for a certain OS/architecture). This is all a very gloomy picture, but of course there are exceptions, like a Mac maintainer temporarily shaking stuff up and causing a major GPC uptake, especially on Mac. However, even then, such initiatives seem to fail to attract many people that stick to the project and do stuff, and rather attract people that try to keep aging Mac Pascal codebases running. OTOH FPC also had this with aging TP users; probably the reality is that you simply have to kiss a lot of frogs to find one prince. (or e.g. Free Basic with QB newbies)

Conclusion

One can discuss which reason is the major one. Some other FPC people have repeatedly pointed to the "compiler in C" part and the immature build environment in the early days (and partially still) as by far the major reason. While I agree with these as a major reason for GPC not having more "core" members (since they are the ones directly touched by this), it can't be the only reason, unless you widen it to the general absence of easy-to-use, somewhat regular releases. That's why I think the other factors also contributed significantly; they have to, since if access to the core source mattered so much, communities around closed platforms (think e.g. QB, TP, Delphi here) would be impossible. (closed or in another language is not THAT different; in both cases there is a bump, either the time to learn the other language or the price)

Some of these issues are easily observed by reading through the mailing list and newsgroup archives. Mails about GPC tend to be mainly about setting it up or standards compliance, while on the FPC side they are much more about adding functionality, libraries and general usage. The GPC questions on the Mac list were more often practical though, but there the Mac maintainer actively supported installation of the compiler into XCode/MW, made periodic releases, etc.
The GPC developers, in their exit message, seemed to indicate the error-prone work on the compiler in plain C (pseudo OO with macros) and keeping up with gcc architectural changes as the main factors.

Conclusions for FPC
--------------------

(some of this has been overtaken by history. Several packages in packages/, especially the fcl- ones, are now decently and regularly maintained)

Some of the community-building problems are visible in FPC circles too, though luckily more in the fringes, outside the core distribution. Packages that are not absorbed into the global distribution (by either FPC or Lazarus) are far less widely used, and less frequently updated, unless shared with Delphi. The contributed units page is effectively a mess. Packages that were accepted but whose maintainer has fallen away (especially for old targets) are in a bad state too, though less outdated than the average contributed units item. Lazarus-CCR also shows this pattern. The quality of contributed units entries varies wildly. (better after 2010)

While the current distribution still grows, most of the growth is either evolution and consolidation of existing packages, or rather riskless header translations. The major exception is the database-related stuff, where Joost does absolutely great work, and the recent work of Sergei on the FCL-XML package. Also the fcl-passrc package has gotten quite some attention (probably mostly because it is actually used, via fpdoc). FCL-Web/fpweb have gotten a lot of love from Michael and Joost too in recent times, but I'm not entirely convinced that it is as generally reusable as the other packages. (it's more a specific implementation of a web framework as they see it)

Some general ways to deal with certain important issues have been avoided (though they were worked around for some projects). Most notably the issues are:

- Dealing with the MPL license (the bulk of open source Delphi code is under MPL).
FPC got a bit more traction and it is easier to convince authors to dual-license, but it remains hard to absorb abandoned codebases that are MPLed. When brought up in discussion, the "package system" is considered the cure-all, but it is still at alpha level, and the details pertaining to licenses aren't fleshed out yet. Still, the package system would at least make for a consistent source of metadata across packages. I hope it will be enough. (Lazarus?)
- The same for BSD/MIT licenses that require some acknowledgement.
- Dealing with large externally maintained packages. This is necessary because these external packages are often hard for a new user to get to work, and they often attract the "heavier" users with large Delphi codebases. A quite high percentage of those users ends up contributing to FPC, directly or indirectly (the latter being bug reports and package maintenance). "It's the developers, stupid!" (in this case package maintainers)
- The FPC/Lazarus division is sometimes also annoying. Core classes are in FPC, but are sometimes mostly used by Lazarus. There is no logical place to put Lazarus-based demonstrations of them though.
- Currently there is only a fixes branch. There is constant tension over whether this is a strict fixes-only branch (as in really _proven_ bugfixes, like FreeBSD -RELEASE branches), or a general "-STABLE" branch (FreeBSD lingo again) that mirrors general development minus the risky stuff. The problem with the former (-RELEASE) is that, according to the rules, it would take several years to get anything new released (until the next major release). Also there is no testbed, and major releases are filled top to bottom with code that hasn't seen wide use (the betas hardly go beyond the same inner crowd that is also capable of using SVN+snapshots). Major releases are already adopted more slowly now, and Lazarus has a nasty habit of ignoring all .0's (which is bad, since Lazarus is our primary "customer"; luckily that seems to have come to an end with 2.6.0). The problem with the latter (-STABLE) is that at some point people take merging too easily, and yes, I am guilty of that too. The current practice is that the fixes branch is treated as -RELEASE as far as the compiler is concerned, and as -STABLE for the libraries/units. The only real solution is multiple branches, but we lack the manpower and the will to do that.

The packages in packages/ were in a deplorable state. Luckily this has improved significantly with 2.2.0, and this continued with 2.2.2 and beyond; and while I worked hard on that myself, I was far from the only one (a lot of people worked hard on it, especially Joost and Micha, Ivo and, more recently, Paul and Dmitry. And Sergei on fcl-XML). On the other hand, new packages in packages/ that are one-off header translations that are never revised or updated are still being committed.

Some other aspects, e.g. dynamic linking, also have some of the characteristics of GPC's problems.