How much memory may a HPPL program use on a G2?
09-13-2023, 12:09 PM
Post: #52
RE: How much memory may a HPPL program use on a G2?
Hi Jeff,
(09-12-2023 10:47 PM)jte Wrote: There’s certainly many algorithms for subset checking, string searching, etc. but one thing this reminded me of was my dad’s program (that he did for a comp. sci. assignment in the ’60s, on punched cards — he told me about it several times…) for handling pizza toppings. His idea was to encode each pizza topping option as a prime so that a selection of pizza toppings could be encoded as a product of those selected.My respects to your dad. In those times, with such limited possibilities, coming up with and additionally implementing something like that was certainly a great achievement. (09-13-2023 05:03 AM)jte Wrote: Another approach to handle blocks of characters when checking for matches [...] that follows more traditional computer-y approaches would be to use 5 bits per character (assuming <= 32 different characters) for the encoding [...] and then use a mask to select which characters must match specific characters.At first, I was thinking about such an approach, but in this case, we are dealing with the French language, which includes accents. So, this approach would require us to maintain two dictionaries because converting text to binary for a 9-letter dictionary with over 50,000 words would take a very long time with each run (involving a lot of bitwise AND/OR/BITSL operations). Additionally, it's not feasible to store such a dictionary directly in binary form because 5 bits per character cannot accommodate 39 combinations for letters (26 + 13 accents). This would require 6 bits per word, limiting the maximum word length to 10 letters when using 64-bit integers. Instead, I opted for lists based on text strings, which are very fast and have no restrictions on both word length and list length. However, if we assume that words have no accents and have a maximum of 12 characters, your solution could indeed be faster. 
Later, I realized that such iteration was also unnecessary: since I have a sorted dictionary, I could implement a binary search, progressively narrowing the search range. This greatly sped up the program, and now at most a few dozen comparisons are needed across the entire dictionary when searching for the first word, so the comparison method has become less critical.

Best regards,
Piotr
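The binary-search idea on a sorted word list can be sketched as follows. This is a Python illustration (the actual program is in HP PPL); the sample words and the `find_with_prefix` helper are my own:

```python
# Two binary searches on a sorted list bound the range of words sharing a
# prefix, so only O(log n) comparisons are needed instead of a linear scan.
import bisect

WORDS = sorted(["abricot", "banane", "citron", "citrouille", "datte"])

def find_with_prefix(words, prefix):
    """Return the slice of sorted `words` that starts with `prefix`."""
    lo = bisect.bisect_left(words, prefix)
    # "\uffff" sorts after every ordinary character, giving an upper bound
    # for all strings that begin with the prefix.
    hi = bisect.bisect_right(words, prefix + "\uffff")
    return words[lo:hi]

assert find_with_prefix(WORDS, "cit") == ["citron", "citrouille"]
assert find_with_prefix(WORDS, "zz") == []
```

For a 50,000-word dictionary this is roughly 16 comparisons per bound, consistent with the "few dozen comparisons" observed above.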