How much memory may a HPPL program use on a G2?
09-17-2023, 09:13 PM
Post: #64
RE: How much memory may a HPPL program use on a G2?
(09-17-2023 08:09 AM)komame Wrote: ⋮

One possibility I was going to mention, when reading the following, was that multiple sort orders could be used so that binary search could be employed even when the search pattern starts with "?". (N sort orders for N-letter words would be the obvious approach, although letter pairs come to mind next [although only half of those would be needed: N(N-1)/2 instead of N(N-1)]. If that's too much precomputation, letter clumps are another approach; these could still help limit the linear traversal.)

(09-13-2023 12:09 PM)komame Wrote: ⋮

But I didn't write that earlier, as it seemed precomputation wasn't of direct interest, based on the following:

(09-13-2023 12:09 PM)komame Wrote: ⋮

My first thought with these sorts of things is that there are distinct phases. Phase 1 is setting things up: getting the dictionary, getting it into the program [writing the program…], setting up search data structures, etc. The performance clock isn't engaged yet! Phase 2 is benchmarking: running the code against a performance meter. (So, with the example above, the "converting text to binary" isn't part of "benchmark time"; it would not be done with each run, but would be part of "setup".)

Hmmm… if the sorting is being done during "benchmark time", perhaps an alternative sort could still be used, based on the search string. (But perhaps keeping the dictionary sorted [with a conventional sort] is allowed as precomputation / considered to be a fundamental part of what it means to be "a dictionary".)

It's nice to see thought-provoking programs!
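The "N sort orders for N-letter words" idea can be sketched as follows. This is a minimal illustration in Python rather than HPPL, and `build_indexes` / `lookup` are hypothetical names, not anything from the program under discussion: each of the N orders sorts the words by the rotation that puts position i first, so any run of concrete letters in the pattern becomes a prefix that binary search can find, even when the pattern starts with "?".

```python
from bisect import bisect_left

def build_indexes(words):
    # For N-letter words, build N sort orders: order i sorts by the
    # rotation w[i:] + w[:i], so letters at positions i, i+1, ...
    # become the front of the sort key.
    n = len(words[0])
    indexes = []
    for i in range(n):
        pairs = sorted((w[i:] + w[:i], w) for w in words)
        indexes.append(([k for k, _ in pairs], [w for _, w in pairs]))
    return indexes

def lookup(pattern, indexes):
    # Anchor the binary search at the longest run of non-'?' letters,
    # then verify any remaining positions linearly.
    runs, i = [], 0
    while i < len(pattern):
        if pattern[i] != '?':
            j = i
            while j < len(pattern) and pattern[j] != '?':
                j += 1
            runs.append((j - i, i))
            i = j
        else:
            i += 1
    if not runs:
        return list(indexes[0][1])  # all-wildcard pattern matches everything
    length, start = max(runs)
    keys, order = indexes[start]
    prefix = pattern[start:start + length]
    lo = bisect_left(keys, prefix)
    hi = bisect_left(keys, prefix + '\x7f')  # sentinel past any ASCII letter
    return [w for w in order[lo:hi]
            if all(p in ('?', c) for p, c in zip(pattern, w))]
```

For example, `lookup("?AT", idx)` binary-searches the order anchored at position 1 for the prefix "AT" instead of scanning the whole list; the trade-off is N sorted copies of the word list, which is exactly the precomputation-versus-memory question raised above.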