Enhanced Random Search Optimization
01-15-2024, 11:48 AM
(This post was last modified: 01-15-2024 11:49 AM by Namir.)
Post: #1
Enhanced Random Search Optimization
=========================

Random search optimization is very easy to code. This simplicity comes at the cost of producing results that are only "close" to the actual answers. This HP Prime program implements an enhanced version of random search optimization. The program performs random searches within the lower and upper bounds specified for each variable. The enhancements are as follows:

1. Each time an improved optimum point is found, the program performs a secondary (a.k.a. 'local') random search in the region around that improved point.
2. After half of the iterations have been performed, the program narrows the search region around the current best point.

The program uses the function RANDOMSEARCH() to perform the optimization. This function has the following parameters:

1) The parameter lb is a row vector that specifies the lower bound for each variable.
2) The parameter ub is a row vector that specifies the upper bound for each variable.
3) The parameter maxIt1 is the maximum number of 'global' random search iterations.
4) The parameter maxIt2 is the maximum number of 'local' random search iterations. The function performs maxIt2 iterations to search around each improved optimum point.

The function returns the row matrix that contains the best "guesses" for the optimum values of the variables.

The function FX contains the code for the optimized function. In the example below, fx(x) = sum((x(i)-i)^2).

To find the values of the four optimum variables for function FX and store the results in matrix M9, I specify the following arguments:

1) The lower bounds row vector [0, 0, 0, 0].
2) The upper bounds row vector [5, 5, 5, 5].
3) A maximum of 100000 global iterations.
4) A maximum of 1000 local iterations (performed EACH time an improved approximation for the optimum is found).

M9 := RANDOMSEARCH([0,0,0,0],[5,5,5,5],100000,1000)

A sample output, located in the first row of matrix M9, is:

0.995571271159, 2.00109660047, 2.98724324198, 4.0215903877

The exact results are: [1, 2, 3, 4]

Remember that since the optimization process depends heavily on random values, the results will vary each time you run the program.

You can fine-tune the coded constants that shrink the search regions lb, ub, lb2, and ub2. You can also remove the parameter maxIt2 and declare it (in a LOCAL clause) inside the function as a fraction of parameter maxIt1. For example, you can use the statement maxIt2:=IP(maxIt1/10). Finally, you can edit the value of the critical iteration, m, that reduces the initial lower and upper bounds. For example, to set m to 75% of maxIt1, use m:=IP(3/4*maxIt1) or m:=IP(0.75*maxIt1).

Here is the listing for the program.

Code:
EXPORT FX(x,n)
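For the example objective fx(x) = sum((x(i)-i)^2), FX(x,n) can be written along these lines (a minimal sketch, assuming n is the number of variables; only the loop body comes from the formula above):

Code:
EXPORT FX(x,n)
BEGIN
  LOCAL i, s;
  s := 0;
  // example objective: fx(x) = sum((x(i)-i)^2), with minimum at [1,2,...,n]
  FOR i FROM 1 TO n DO
    s := s + (x(i)-i)^2;
  END;
  RETURN s;
END;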
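Next is a sketch of how RANDOMSEARCH itself could be structured, following the description above. The shrink constants 0.05 and 0.25, the midpoint starting guess, and the use of SIZE(lb) to get the number of variables are illustrative choices, so the actual listing may differ in these details.

Code:
EXPORT RANDOMSEARCH(lb,ub,maxIt1,maxIt2)
BEGIN
  LOCAL n, i, j, k, m, rng;
  LOCAL x, fx, xBest, fBest;
  LOCAL lb2, ub2, x2, fx2;

  n := SIZE(lb);        // number of variables (assumes lb is a vector)
  m := IP(maxIt1/2);    // critical iteration that shrinks the global bounds

  xBest := (lb+ub)/2;   // illustrative starting guess: midpoint of the search box
  fBest := FX(xBest,n);

  FOR i FROM 1 TO maxIt1 DO
    // 'global' random point inside [lb, ub]
    x := lb;
    FOR j FROM 1 TO n DO
      x(j) := lb(j) + RANDOM*(ub(j)-lb(j));
    END;
    fx := FX(x,n);
    IF fx < fBest THEN
      xBest := x; fBest := fx;
      // enhancement 1: 'local' random search around the improved point
      // (0.05 is an illustrative shrink constant)
      lb2 := xBest - 0.05*(ub-lb);
      ub2 := xBest + 0.05*(ub-lb);
      FOR k FROM 1 TO maxIt2 DO
        x2 := lb2;
        FOR j FROM 1 TO n DO
          x2(j) := lb2(j) + RANDOM*(ub2(j)-lb2(j));
        END;
        fx2 := FX(x2,n);
        IF fx2 < fBest THEN
          xBest := x2; fBest := fx2;
        END;
      END;
    END;
    // enhancement 2: after half the iterations, narrow the global region
    // around the current best point (0.25 is an illustrative shrink constant)
    IF i == m THEN
      rng := ub - lb;
      lb := xBest - 0.25*rng;
      ub := xBest + 0.25*rng;
    END;
  END;

  RETURN xBest;  // best point found
END;

As noted above, you can drop the maxIt2 parameter and set it inside the function with maxIt2:=IP(maxIt1/10), and you can move the critical iteration later in the run with, for example, m:=IP(0.75*maxIt1).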