David Kastrup wrote:
I don't quite see why the possibility of a workaround that patches LaTeX internals, and at the least has adverse effects on file size, should be a reason not to bother with raising the limit.
Well, since Taco (who makes those final decisions) is away for two weeks, you need to apply the 'open source principle': take the source, patch it, and do your own good. After all, it's the team's responsibility to make sure that pdfTeX is stable. The 2048 patch has always been there but for some reason was commented out (I'm not going to repeat possible reasons), so we will have to make sure nothing else breaks (also on other platforms). In any case a patch has to wait until the next release (although of course Akira is free to bump the number for his code branch), but even then any limit (2048 or otherwise) will be reached at some point, which, by the way, is one of the reasons why ConTeXt users have control over xform behaviour.

Anyhow, the xform primitive redefinition that Martin posted is OK and rather harmless: if you include the same page from a document many times you get a bigger file, but that's pretty easy to catch in supervising macro code. Also, I assume that LaTeX has only a few places where xform inclusion is dealt with, so patching cannot be that complex; you can even decide to apply Martin's hack selectively to your code. Anyone dealing professionally with huge PDF files and inclusions will at some point need dedicated code to deal with it.

Also, given the not so nice tone of this (also platform-related) discussion, I suggest we stop this thread.

Hans

-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | fax: 038 477 53 74
www.pragma-ade.com | www.pragma-pod.nl
-----------------------------------------------------------------
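
(For illustration only, since Martin's actual macro is not quoted above: a rough sketch of what per-call control over xobject reuse could look like using the pdfTeX primitives \pdfximage, \pdflastximage and \pdfrefximage. The macro name \includepdfpage and the \ifreuseimages switch are invented names for this sketch; they are not Martin's patch and not part of LaTeX.)

  % sketch only: either reuse one xobject per file/page, or copy it each time
  \newif\ifreuseimages  \reuseimagestrue

  \def\includepdfpage#1#2{% #1 = file name, #2 = page number
    \ifreuseimages
      % create the xobject once per file/page and remember its number
      \expandafter\ifx\csname img:#1:#2\endcsname\relax
        \pdfximage page #2 {#1}%
        \expandafter\xdef\csname img:#1:#2\endcsname{\the\pdflastximage}%
      \fi
      \pdfrefximage\csname img:#1:#2\endcsname\relax
    \else
      % no reuse: every call embeds a fresh copy, so including the same
      % page many times gives a bigger file
      \pdfximage page #2 {#1}%
      \pdfrefximage\pdflastximage
    \fi}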