Hoi,
Pootle is only one of the tools that translators can use. Professional translators in particular have their own tools for this work. Localising in a web interface takes substantially more time than working offline. When you are not working with professionals, it makes sense to optimise an environment that is best suited for the community involved. I have heard that many translators resent projects like Pootle because they cannot benefit from their own machine translation tools. I know translators who refuse to work online; it is not efficient for them, and the cost is too high.
For MediaWiki, the localisation is done in BetaWiki. This is an environment optimised for the MediaWiki community. Its software has become quite sophisticated and includes management tools for the transfer to the MediaWiki SVN. It supports the localisation of MediaWiki really well. Currently many more languages are supported in BetaWiki than in Pootle. For professional translators we have an export in the "gettext" format; it includes the text to be translated, the existing translation, and usage information. This can be imported into tools like OmegaT, a professional translation tool licensed under the GPL.
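To give an idea, a single entry in such a gettext export might look roughly like this (the message name, file reference, and comments here are made up for illustration; BetaWiki's actual export may differ):

```
#. Label for the button that saves an edited page (usage information)
#: includes/EditPage.php  (hypothetical source reference)
msgctxt "savearticle"
msgid "Save page"
msgstr "Pagina opslaan"
```

The translator comment (`#.`) and source reference (`#:`) carry the usage information, while `msgid` and `msgstr` hold the source and translated text, so a tool like OmegaT can show all of it side by side.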
The biggest problem for many of the less resourced languages is not localisation; it is recognition. In no tool that I know of is it possible to indicate all the written languages that are recognised according to ISO 639-3. The consequence is that you cannot even indicate from within your tooling that you are writing your own language. Consequently, much of the Internet provides meta information that is completely wrong. It is so bad that you cannot rely on metadata to know what language is used. Sorry for this rant, but if you are interested in solving this issue please contact me.
Thanks,
Gerard
On Jan 11, 2008 12:11 PM, Alexander Todorov <atodorov@redhat.com> wrote: