Re: [NTG-pdftex] [tex-live] Runtime limitations on open files?
Philip TAYLOR writes:
[...] there are ten million computers in the world running *X, and each and every one of them has different libraries, different compilers, different versions, different this, different that, different everything.
I'm using Gentoo Linux, a source code distribution, and I'm glad to see how well it works. I'm not a C programmer, and I would certainly have a problem if anything required manual intervention to compile. But everything works perfectly. Yes, there are many different libraries with different version numbers, but it doesn't matter, because programs usually know which (versions of) libraries they need.

There are different versions of .DLL files on Windows too; unfortunately, the file name doesn't contain the version number. If people always used standard tools to install software, the installer would certainly look into the .DLL file and prevent a recent version from being replaced by an older one. But there are the people you called monkeys in a previous mail, and there are many monkeys in the Windows world. I often get CDs containing data sheets, usually PDF files. Some of them come with a toc file in HTML, but some require that you install them, whatever that means. Some even require that you have admin rights. These CDs go into the trash can immediately: either they want to break my system deliberately, which I don't believe because they want me to buy their products, or they are what you call monkeys. I doubt that anybody who doesn't know what an admin account is good for is able to install software on my system properly. There are probably monkeys in the UNIX world too, but I think that shared libraries carrying the version number in the file name is much more robust than the Windows approach (which is definitely not monkey-proof).

What you said about UNIX was an assumption. If you have some time, why not install a Linux system, maybe on a virtual machine, and play with it? You'll see that all your assumptions are wrong. There are many Linux distributions available, but I think that Gentoo is the best one for you because, as far as I know you, it's important for you to understand how things work. You have to do things manually while other distributions provide menus, but you immediately see how things work. I really hope that you find some time to play with it.
Yes, there is a "registry" tweak that will stop Explorer from trying to enumerate remote (networked) files; it is a pain, and I completely agree. When I find the reference, I'll forward it to you (and your administrators!).
That would be nice. The problem exists on XP but not on 9x. My first assumption was that it was a DNS problem, but specifying the IP address instead of the hostname didn't help. I now also believe that there is something wrong either in the network setup or in the registry. A colleague already looked into the network setup and found a few improvements, but the main problem still exists. Fabrice said that the registry is a distributed database, so I assume that I can't simply make a backup before I edit it. Hence, I'm interested in hints from people who know what has to be changed; running my own experiments is too dangerous. I'm on vacation at the moment, but I'll come back to the issue later.
P.S. About as "intuitive" as [con]cat[enating] a file with nothing in order simply to see it on the screen ...
You *can* [ab]use 'cat' to print a file to the screen, but the UNIX tool designed for this task is 'more', not 'cat'. There is also 'less', which allows you to scroll backwards, but it's probably not installed on every UNIX system.

Regards,
  Reinhard

--
Reinhard Kotucha                 Phone: +49-511-4592165
Marschnerstr. 25                 D-30167 Hannover
mailto:reinhard.kotucha@web.de
Microsoft isn't the answer. Microsoft is the question, and the answer is NO.
2007/8/19, Reinhard Kotucha:
Philip TAYLOR writes:
[...] there are ten million computers in the world running *X, and each and every one of them has different libraries, different compilers, different versions, different this, different that, different everything.
I'm using Gentoo Linux, a source code distribution, and I'm glad to see how well it works. I'm not a C programmer and I'll [...]
Please stop this discussion now. It has completely left its subject, and I don't think anybody here is really interested in a holy war. Windows exists and has users who want to use TeX. We have to support them. The same is true for Unix. EOD.

Best
Martin
"Martin Schröder"
Please stop this discussion now. It has totally left its subject and I don't think anybody here is really interested in a Holy war.
Windows exists and has users who want to use TeX. We have to support them.
So can we have the one-liner extending the number of open files to 2048? And as a bonus point, can somebody tell me how I get a working Windows executable once this has been done in the source?

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum
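The "one-liner" under discussion is presumably a call to the MSVC runtime's _setmaxstdio(), which raises the per-process limit on simultaneously open FILE streams from the default of 512 to, in that era's CRT, at most 2048. A minimal sketch; the hook function below is hypothetical and not taken from the pdftex source:

    #include <stdio.h>

    /* Hypothetical startup hook; pdftex's real initialization differs. */
    static void raise_open_file_limit(void)
    {
    #ifdef _WIN32
        /* MSVC CRT default is 512 FILE streams; 2048 was the maximum the
           CRT allowed at the time. _setmaxstdio() returns -1 on error. */
        if (_setmaxstdio(2048) == -1)
            fprintf(stderr, "warning: could not raise stdio stream limit\n");
    #endif
    }

This only affects streams opened through the C stdio layer; as noted later in the thread, it is the C runtime rather than win32 itself that imposes the ceiling.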
"Martin Schröder"
2007/8/19, David Kastrup:
So can we have the one-liner extending the number of open files to 2048?
When you tell us why making \pdfximage non-\immediate doesn't help
You mean, making it \immediate.
you, i.e. why you need the change.
Queens don't make bargains.

More seriously: I am in maintenance mode here, trying to get a large XMLTeX-based application to run at a customer site with minimal changes. I will do a code audit tomorrow in order to figure out what may be possible there. Since we do a heavy amount of pdfTeX-specific manipulation, I suspect that a simple-minded change like the one you proposed will break things and/or blow up the size of the documents (and they are already in the Gigabyte range).

I don't quite see why the possibility of a workaround patching LaTeX internals, with at least adverse effects on possible file size, should constitute a reason not to bother about raising the limit.

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum
David Kastrup wrote:
I don't quite see why the possibility of a workaround patching LaTeX internals, with at least adverse effects on possible file size, should constitute a reason not to bother about raising the limit.
well, since taco (who makes those final decisions) is now away for two weeks, you need to apply the 'open source principle' and 'take the source, patch it and do your own good'; after all, it's the team's responsibility to make sure that pdftex is stable

since the 2048 patch has always been there but for some reason was commented out (i'm not going to repeat possible reasons), we will have to make sure nothing else breaks (also for other platforms); so, in any case, a patch has to wait till the next release (although of course akira is free to bump the number for his code branch); but even then, any limit (2048) will be reached at some point (which, btw, is one of the reasons why context users have control over xform behaviour)

anyhow, the xform primitive redefinition that martin posted is ok and rather harmless (if you include the same page from a document many times, you get a bigger file, but that's pretty easy to catch in supervising macro code); also, i assume that latex has only a few places where xform inclusion is dealt with, so patching cannot be that complex; you can even decide to apply martin's hack selectively to your code (anyone dealing with huge pdf files and inclusions professionally will at some point need dedicated code to deal with it)

also, given the not so nice tone of this (also platform related) discussion, i suggest we stop this thread

Hans

-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | fax: 038 477 53 74
www.pragma-ade.com | www.pragma-pod.nl
-----------------------------------------------------------------
Hans Hagen writes:
David Kastrup wrote:
I don't quite see why the possibility of a workaround patching LaTeX internals, with at least adverse effects on possible file size, should constitute a reason not to bother about raising the limit.
well, since taco (who makes those final decisions) is now away for two weeks, you need to apply the 'open source principle' and 'take the source, patch it and do your own good';
Oh, I would if I could get it to compile... But it would seem that the most viable way to do this for me would be to patch the binary.

It is even conceivable that getting a cross-compilation environment to work would be all that is required here, without any patch to the source: it is not win32 per se that has the limitation, but rather the C library runtime that is used, and this runtime might well be different in a cross-compilation setting.

--
David Kastrup, Kriemhildstr. 15, 44793 Bochum
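This point can be illustrated with a small probe: ISO C only guarantees FOPEN_MAX simultaneously open streams, and the actual ceiling is a property of whichever C runtime the binary is linked against. A sketch, assuming an MSVC-compatible CRT on Windows (_getmaxstdio() only exists there):

    #include <stdio.h>

    int main(void)
    {
    #ifdef _WIN32
        /* MSVC CRT: query the current stream limit at run time */
        printf("stdio stream limit: %d\n", _getmaxstdio());
    #else
        /* ISO C: the guaranteed minimum number of open streams */
        printf("FOPEN_MAX: %d\n", FOPEN_MAX);
    #endif
        return 0;
    }

A binary produced in a cross-compilation environment may link against a different runtime and therefore report, and enforce, a different limit.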
Martin Schröder writes:
When you tell us why making \pdfximage non-\immediate doesn't help you, i.e. why you need the change.
I already experimented with \immediate, but in my script it doesn't make any difference. Maybe I overlooked something.

If you want to try it yourself, download

    http://ms25.ath.cx/pdfcatdir

and run ./pdfcatdir --help for more information. If you want to play with \immediate, you don't have to modify pdfcatdir itself. Just run

    pdfcatdir --debug

and the generated .tex file will not be removed. You can edit it and then run

    pdftex -ini <dirname>.tex

When I replace \pdfximage by \immediate\pdfximage, I don't see any difference in the output of

    strace -f -e open,close pdfcatdir somedir

Regards,
  Reinhard
2007/8/20, Reinhard Kotucha:
I already experimented with \immediate but in my script it doesn't make any difference. Maybe I overlooked something.
The reason is your use of -initex:

    void deleteimage(integer img)
    {
        if (iniversion)
            return;  // The image may be \dump{}ed to a format
        ...
    }

With -ini, iniversion is true, so deleteimage returns without doing anything: images are kept because they may be \dump{}ed into a format, and so their files are never closed. That is why \immediate makes no difference in your trace.

Best
Martin
participants (4):

- David Kastrup
- Hans Hagen
- Martin Schröder
- Reinhard Kotucha