You know... the default "load everything" method of file handling never made sense to me, as far back as DOS, when "edit" refused to load anything over 64k. An add-on replacement for Command.com, called 4DOS, included a "list" command to enhance "more", which worked a bit more sanely imho: it loaded a set amount of the file into a memory buffer, then only loaded new sections as needed. A lot of games, like the old Ultima series, used similar tricks to allow large worlds, by keeping only the "current" part of the world, and the two closest chunks to your location, loaded.
Point being? While the code might get more complex, it's not sane, if you are likely to have large files but low memory, to try to load the entire thing instead of buffering the section you will need next, before you need it. The same 1MB limit could be split into two 512k sections, one for the "active" chunk and the other for the "next" chunk, and still solve the problem without adding a lot more overhead. A 2, 3, 4, etc. MB file is going to take more time to load anyway. While increasing the load size is bound to help, it's a lazy solution imho, and one that will just cause problems again the next time someone comes up with an even bigger file. As scary as it is to contemplate an even bigger one... lol
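The two-buffer idea above can be sketched in a few lines. This is just a hypothetical illustration in Python (the function name and chunk size are my own, not from any particular program): hold one "active" chunk, prefetch the "next" one, and never keep more than two chunks resident no matter how big the file gets.

```python
CHUNK_SIZE = 512 * 1024  # 512k per buffer, as in the two-buffer example above

def read_double_buffered(path, chunk_size=CHUNK_SIZE):
    """Yield a file chunk by chunk, holding at most two chunks
    (the active one plus the prefetched next one) in memory."""
    with open(path, "rb") as f:
        active = f.read(chunk_size)       # load the first "active" chunk
        while active:
            nxt = f.read(chunk_size)      # prefetch the "next" chunk
            yield active                  # hand the active chunk to the caller
            active = nxt                  # promote the prefetched chunk
```

A real viewer would prefetch on a background thread while the user reads the active chunk, but even this serial version caps memory use at two buffers instead of the whole file.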