On Fri, 17 Aug 2007, Taco Hoekwater wrote:
David Kastrup wrote:
I can't see any downsides to that, anyway. It might be nice to stop leaving files open indiscriminately, but this stopgap measure looks like an easy cop-out at the moment.
As stopgaps go, this is a pretty decent solution, I think. In general, simply closing the image files after the information discovery and then reopening them for inclusion won't do, as the file may be messed up externally in the time between those two events.
When the files are first opened, only very limited information is extracted (type, dimensions, density) and stored; the rest doesn't matter. So when a file is opened a second time for inclusion, only these stored parameters need to be checked against the ones now in the file (i.e., the parameters are read a second time). If they match, nothing can go wrong, even if the image contents differ (the new image would simply be included). An error (or warning?) would be given only if the dimension data don't match.

Would it be critical if the image contents have changed in the meantime? Is it really necessary to nail down the image by keeping the file open? Do images change dynamically during a session? If so, maybe it would be natural to take the freshest one anyway?
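The re-check scheme described above could be sketched roughly as follows (a minimal illustration in Python, not pdfTeX's actual code; the function names are hypothetical, and only the PNG case is shown): at discovery time only the cheap header parameters are read and cached, and at inclusion time the file is reopened, the same parameters are re-read, and a warning is issued if they no longer match.

```python
import struct

def read_png_params(path):
    """Read type, width, and height from a PNG header (IHDR chunk)."""
    with open(path, "rb") as f:
        sig = f.read(8)
        if sig != b"\x89PNG\r\n\x1a\n":
            raise ValueError("not a PNG file")
        f.read(8)  # skip IHDR chunk length + chunk type
        width, height = struct.unpack(">II", f.read(8))
    return {"type": "png", "width": width, "height": height}

def include_image(path, cached):
    """Reopen the file just before inclusion and re-check the cached parameters."""
    fresh = read_png_params(path)
    if fresh != cached:
        print(f"warning: {path} changed since discovery: {cached} -> {fresh}")
    # ... proceed to embed the (possibly updated) image contents ...
    return fresh
```

Note that this only guards the parameters actually compared: if the pixel data change but the dimensions stay the same, the fresh contents are embedded silently, which is exactly the behaviour proposed above.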
On almost all systems, keeping an open file handle protects you against exactly that.
Regards, Hartmut