Random search is an optimization algorithm that is easy to code. The approach searches for the best values of the independent variables, each within a trusted range. The trick is to use a large number of iterations. The best solution found tends to deviate from the true optimum as the number of variables increases and as the optimized function becomes more complex to evaluate.
There are a few approaches I recently used that may enhance random searching:
1) If the independent variables can be grouped by category (such as temperature, pressure, concentration, length, and so on), then you can add more sets of searches inside the main iteration loop. Each set draws random values for the variables in a specific category while keeping the other variables at their best values obtained so far.
2) If you notice, while running tests on your problem, that the values of the independent variables oscillate above and below their correct optimum values, then you can add another set of calculations to the main loop. This additional set uses the averages of the best values obtained so far for each independent variable. It then generates random numbers in ranges of, say, 20% or 30% around those averages. You have the option of including the values of independent variables that produce better minima in the running totals of the best values. This step tends to favor the average best values of the independent variables. Remember, however, that each iteration is still free to pick random values for the independent variables, each within its prescribed range.
Here is HP Prime code that illustrates the above schemes.
Function f1 can handle any number of variables; I recommend testing it with three. With three variables its minimum of zero lies at [1.234 2.345 3.456], so I suggest calls like MC1(10000,[0 0 0],[5 5 5],3) and MC2(10000,[0 0 0],[5 5 5],3).
Function f2 is coded for four variables and has its minimum of zero at [100 200 1 1.5]. I suggest calls like MC2B(10000,[50 50 0 0],[300 300 2 2],4) and MC3(10000,[50 50 0 0],[300 300 2 2],4).
Code:
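// Test function f1: sum of squared deviations of each x[i]
// from the target i+(i+1)/10+(i+2)/100+(i+3)/1000.
// Its minimum value is 0.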
EXPORT f1(x, numvars)
BEGIN
LOCAL i,s,z;
s:=0;
FOR i FROM 1 TO numvars DO
z:=i+(i+1)/10+(i+2)/100+(i+3)/1000;
s:=s+(x[i]-z)^2;
END;
RETURN s;
END;
// Simple random search. Optimizes function f1().
//
EXPORT MC1(maxit,xlow,xhi,numvars)
BEGIN
LOCAL iter,i,x,fx,bestfx,bestx;
// create local vectors
bestx:=MAKEMAT(0,numvars);
x:=MAKEMAT(0,numvars);
// get initial best solution
FOR i FROM 1 TO numvars DO
bestx[i]:=RANDOM(xlow[i],xhi[i]);
END;
bestfx:=f1(bestx, numvars);
// start main loop
FOR iter FROM 1 TO maxit DO
FOR i FROM 1 TO numvars DO
x[i]:=RANDOM(xlow[i],xhi[i]);
END;
fx:=f1(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
END;
END;
END;
RETURN bestx;
END;
// Random search uses average of best values
// to enhance search. Optimizes function f1().
EXPORT MC2(maxit,xlow,xhi,numvars)
BEGIN
LOCAL iter,i,x,fx,bestfx,bestx;
LOCAL sumx, count, avg;
// create local vectors
bestx:=MAKEMAT(0,numvars);
x:=MAKEMAT(0,numvars);
sumx:=MAKEMAT(0,numvars);
count:=0;
// get initial best solution
FOR i FROM 1 TO numvars DO
bestx[i]:=RANDOM(xlow[i],xhi[i]);
END;
bestfx:=f1(bestx, numvars);
// start main loop
FOR iter FROM 1 TO maxit DO
// general random search
FOR i FROM 1 TO numvars DO
x[i]:=RANDOM(xlow[i],xhi[i]);
END;
fx:=f1(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
sumx[i]:=sumx[i]+x[i];
END;
count:=count+1;
END;
// search using the averages of the best points found so far
IF count>0 THEN
FOR i FROM 1 TO numvars DO
// random value within about 20% of the running average
avg:=sumx[i]/count;
x[i]:=RANDOM(0.8*avg,1.2*avg);
END;
fx:=f1(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
END;
END;
END;
END;
RETURN bestx;
END;
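// Test function f2: sum of squared deviations of the four
// variables from the targets a0=100, b0=200, k1=1, k2=1.5.
// Its minimum value is 0; the numvars argument is not used.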
EXPORT f2(x, numvars)
BEGIN
LOCAL i, s;
LOCAL a0, b0, k1, k2;
a0:=100;
b0:=200;
k1:=1;
k2:=1.5;
s:=(x[1]-a0)^2+(x[2]-b0)^2+(x[3]-k1)^2+(x[4]-k2)^2;
RETURN s;
END;
// Version of MC2 that optimizes function f2().
EXPORT MC2B(maxit,xlow,xhi,numvars)
BEGIN
LOCAL iter,i,x,fx,bestfx,bestx;
LOCAL sumx, count, avg;
// create local vectors
bestx:=MAKEMAT(0,numvars);
x:=MAKEMAT(0,numvars);
sumx:=MAKEMAT(0,numvars);
count:=0;
// get initial best solution
FOR i FROM 1 TO numvars DO
bestx[i]:=RANDOM(xlow[i],xhi[i]);
END;
bestfx:=f2(bestx, numvars);
// start main loop
FOR iter FROM 1 TO maxit DO
// general random search
FOR i FROM 1 TO numvars DO
x[i]:=RANDOM(xlow[i],xhi[i]);
END;
fx:=f2(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
sumx[i]:=sumx[i]+x[i];
END;
count:=count+1;
END;
// search using the averages of the best points found so far
IF count>0 THEN
FOR i FROM 1 TO numvars DO
// random value within about 20% of the running average
avg:=sumx[i]/count;
x[i]:=RANDOM(0.8*avg,1.2*avg);
END;
fx:=f2(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
END;
END;
END;
END;
RETURN bestx;
END;
// Random search method that optimizes
// function f2() and performs separate
// random searches for the group x[1], x[2]
// and for the group x[3], x[4].
EXPORT MC3(maxit,xlow,xhi,numvars)
BEGIN
LOCAL iter,i,x,fx,bestfx,bestx;
// create local vectors
bestx:=MAKEMAT(0,numvars);
x:=MAKEMAT(0,numvars);
// get initial best solution
FOR i FROM 1 TO numvars DO
bestx[i]:=RANDOM(xlow[i],xhi[i]);
END;
bestfx:=f2(bestx, numvars);
// start main loop
FOR iter FROM 1 TO maxit DO
FOR i FROM 1 TO numvars DO
x[i]:=RANDOM(xlow[i],xhi[i]);
END;
fx:=f2(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
END;
END;
// randomize x[1] and x[2]; keep x[3] and x[4] at their best values
FOR i FROM 1 TO numvars DO
IF i<=2 THEN
x[i]:=RANDOM(xlow[i],xhi[i]);
ELSE
x[i]:=bestx[i];
END;
END;
fx:=f2(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
END;
END;
// randomize x[3] and x[4]; keep x[1] and x[2] at their best values
FOR i FROM 1 TO numvars DO
IF i>2 THEN
x[i]:=RANDOM(xlow[i],xhi[i]);
ELSE
x[i]:=bestx[i];
END;
END;
fx:=f2(x,numvars);
IF fx<bestfx THEN
bestfx:=fx;
FOR i FROM 1 TO numvars DO
bestx[i]:=x[i];
END;
END;
END;
RETURN bestx;
END;
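With 10,000 iterations, each of the suggested calls should return a vector reasonably close to the true minimum listed above. Increasing maxit improves the accuracy of the results at the cost of longer run times.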