Particle Swarm Optimization (PSO)


Wikipedia defines PSO as follows:

In computer science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. PSO optimizes a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over the particle's position and velocity. Each particle's movement is influenced by its local best known position but, is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.

Now we will see how to implement it in C.

STEP 1: Initialize swarm positions


We need to place 'N' particles in the swarm (search) space. This is done using the following equation. Here, 'Xi' refers to the i-th variable of the function 'f' whose value we are to optimize, and 'rand' refers to a uniformly distributed random number between 0 and 1. In the source code given below, we have three variables to optimize. We must perform this for every variable of all 'N' particles to get the initial coordinates of the swarm.

Xi = Xi(min) + rand * (Xi(max) - Xi(min))
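The initialization step can be sketched in C as follows. This is a minimal sketch, assuming a single coordinate per call; the names `rand01` and `init_positions` are illustrative, not from the listing below.

```c
#include <stdlib.h>

#define N 10 /* number of particles, as in the listing below */

/* uniform random number in [0,1) */
float rand01(void)
{
    return (float)rand() / ((float)RAND_MAX + 1.0f);
}

/* fill pos[0..N-1] with random positions for one coordinate,
   uniformly inside the bounds [xmin, xmax] */
void init_positions(float pos[N], float xmin, float xmax)
{
    for (int j = 0; j < N; j++)
        pos[j] = xmin + rand01() * (xmax - xmin);
}
```

Calling this once per variable gives the full initial swarm.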

STEP 2: Initial velocity of particles and local minimum


We know velocity is given by position change per unit time. We set the initial velocity to zero for convenience, but if we are to restart the search from a previously located position, this must be updated. For all the particles in the swarm space, the fitness function is computed, which is just the function we are to optimize. Usually this algorithm works best for minimization, so it is imperative to convert our problem into a minimization problem (for example, by computing an error term and minimizing it). From this, the particle with the lowest fitness is taken as the local minimum and made the leader for the next step (akin to the goose in a flock that is closest to the food). On the first iteration this is also set as the global minimum; in every later iteration the global minimum is updated whenever a better local minimum is found.

Pl = position with the lowest f among the particles ( = Pg on the first step only)
if f(Pl) < f(Pg) then Pg = Pl
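Finding the leader amounts to an argmin over the fitness values. A minimal sketch (the name `best_index` is illustrative; the listing below does the same search inline over `d[i][k]`):

```c
/* return the index of the particle with the lowest fitness */
int best_index(const float fitness[], int n)
{
    int q = 0;
    for (int k = 1; k < n; k++)
        if (fitness[k] < fitness[q])
            q = k;
    return q;
}
```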


STEP 3: Update velocity of particles


Having set the initial velocity to zero, we are to find the velocities of the particles for each timestep 'k'. We can set each timestep to 1 for convenience. Here, the velocity update drives the particles towards the local and global minima. C1 and C2 are the self and swarm confidence ratios, which can be set between 0 and 2 to favor the local or the global minimum. The velocity update is given by the following equation (each 'rand' is drawn independently).

Vk+1 = C1 * rand * (Pl - Xk) + C2 * rand * (Pg - Xk)
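As C code, the update for one particle in one coordinate looks like this. The random draws are passed in as arguments so the formula is easy to test; the function name is illustrative. Note there is no inertia term here, matching the listing below (many PSO variants add a `w * v` term).

```c
/* velocity update for one coordinate of one particle:
   x  - current position, pl - personal best, pg - global best,
   c1, c2 - self/swarm confidence ratios, r1, r2 - random numbers in [0,1) */
float update_velocity(float x, float pl, float pg,
                      float c1, float c2, float r1, float r2)
{
    return c1 * r1 * (pl - x) + c2 * r2 * (pg - x);
}
```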

STEP 4: Update new positions of the particles


The new positions of the particles are obtained by adding the velocities to the previous positions. In the code below, positions that leave the variable bounds are clamped back to the limits.

Xk+1 = Xk + Vk+1
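The position update with bound clamping can be sketched as a small helper (the name is illustrative; the listing below does the same thing inline for each of the three variables):

```c
/* move one coordinate by its velocity and clamp the result
   back into the search bounds [xmin, xmax] */
float update_position(float x, float v, float xmin, float xmax)
{
    float xnew = x + v;
    if (xnew > xmax) xnew = xmax;
    if (xnew < xmin) xnew = xmin;
    return xnew;
}
```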

STEP 5: Iterate


Go to STEP 2 and iterate this procedure until error convergence is achieved. The major advantage of this method is that the search ends as soon as the first particle reaches the optimum, unlike a genetic algorithm, where the whole population must converge. In practice, PSO is often much faster than a GA.




The source code for an example objective function



//##################################################
// A serial PSO implementation
// Copyright 2009 Suresh Kondati natarajan
// www.sureshaa.me.pn
//##################################################

#include<stdio.h>
#include<math.h>
#include<stdlib.h>
#include<time.h>

float randomvariable(void); // returns a uniform random number in [0,1)

int main()

{

float a[10][10],b[10][10],c[10][10],d[10][10],x[10][10],y[10][10],z[10][10]; 
// static arrays to store the initial points and the trajectory
float var,var1,fmin,fmax,cmin,cmax,smin,smax;
int i=0;
float c1,c2;
int j,q,u,v,k,r;

srand(time(NULL)); // seed the random number generator from the system clock

//input bounds for the variables in the function to be optimized.
//
//

printf("Enter the initial and final values of cutting speed\n");
scanf("%f %f",&cmin,&cmax);
printf("Enter the initial and final values of workhead speed\n");
scanf("%f %f",&smin,&smax);
printf("Enter the initial and final values of fine feed rate\n");
scanf("%f %f",&fmin,&fmax);
printf("Enter the self confidence ratio(0-2)\n");
scanf("%f",&c1);
printf("Enter the swarm confidence ratio(0-2)\n");
scanf("%f",&c2);

// swarm initialization, function is hardcoded in d[i][j]. 
// Can be modified for different functions. Initial position of 
// points (10) as indicated in the static matrix allocation are 
// given by the following block.
//

for(j=0;j<10;j++)
{
	var=randomvariable();
	printf("var %f \n",var);
	a[i][j]=cmin+var*(cmax-cmin);
	var=randomvariable();
	b[i][j]=smin+var*(smax-smin);
	var=randomvariable();
	c[i][j]=fmin+var*(fmax-fmin);
	d[i][j]=(-47.0669+(0.1633*a[i][j])-(0.978167*b[i][j])+
		(54.0917*c[i][j])-(0.000084*a[i][j]*a[i][j])-
		(13.5625*c[i][j]*c[i][j])+(0.00065*a[i][j]*b[i][j])-
		(0.023*a[i][j]*c[i][j]));
	printf("%f %f %f %f\n",a[i][j],b[i][j],c[i][j],d[i][j]);
}

//search for global (local)  minimum setting
//
//

printf("this finds the global minimum\n");

iteration:
j=0;
	for(k=1;k<10;k++)
	{
		if(d[i][j]<d[i][k])
		{
		q=j;
		}
		else
		{
		j=k;
		q=j;
		}
	}

printf("global minimum values\n");
printf("x1b x2b x3b fxb\n");
printf("%f %f %f %f\n",a[i][q],b[i][q],c[i][q],d[i][q]);

//update the velocity of the particles such that the particles 
//moves towards the point with global (local)  minimum
//
//

printf("This step updates velocity\n");
printf("v1 v2 v3\n");

	for(j=0;j<10;j++)
	{
	u=0;
		for(v=0;v<=i;v++)
		{
			if(d[u][j]<d[v][j])
			{
			r=u;
			}
			else
			{
			u=v;
			r=v;
			}
		}

	var=randomvariable();
	var1=randomvariable();
	x[i][j]=(c1*var*(a[r][j]-a[i][j]))+(c2*var1*(a[i][q]-a[i][j]));
	var=randomvariable();
	var1=randomvariable();
	y[i][j]=(c1*var*(b[r][j]-b[i][j]))+(c2*var1*(b[i][q]-b[i][j]));
	var=randomvariable();
	var1=randomvariable();
	z[i][j]=(c1*var*(c[r][j]-c[i][j]))+(c2*var1*(c[i][q]-c[i][j]));
	printf("%f %f %f\n",x[i][j],y[i][j],z[i][j]);
	}

//updated position by adding the computed velocities to the previous positions
//
//

printf("New updated position\n");
printf("x1 x2 x3 fx\n");
i=i+1;
	for(j=0;j<10;j++)
	{
	a[i][j]=a[i-1][j]+x[i-1][j];
	b[i][j]=b[i-1][j]+y[i-1][j];
	c[i][j]=c[i-1][j]+z[i-1][j];
		if(a[i][j]>cmax)
		{
		a[i][j]=cmax;
		}
		if(b[i][j]>smax)
		{
		b[i][j]=smax;
		}
		if(c[i][j]>fmax)
		{
		c[i][j]=fmax;
		}
		if(a[i][j]<cmin)
		{
		a[i][j]=cmin;
		}
		if(b[i][j]<smin)
		{
		b[i][j]=smin;
		}
		if(c[i][j]<fmin)
		{
		c[i][j]=fmin;
		}

	d[i][j]=(-47.0669+(0.1633*a[i][j])-(0.978167*b[i][j])+(54.0917*c[i][j])-
		(0.000084*a[i][j]*a[i][j])-(13.5625*c[i][j]*c[i][j])+
		(0.00065*a[i][j]*b[i][j])-(0.023*a[i][j]*c[i][j]));
	printf("%f %f %f %f\n",a[i][j],b[i][j],c[i][j],d[i][j]);
	}

//epoch continuation
//
//

if( i<6 )
{
goto iteration;
}
else

//Print final optimal values
//
//

{
printf("the iterations are completed\n");
printf("the optimum values are\n");
printf("x1b x2b x3b fxb\n");
printf("%f %f %f %f\n",a[i][q],b[i][q],c[i][q],d[i][q]);
}

return 0;

}

//A subroutine to get a uniformly distributed random number in [0,1)
//
//

float randomvariable(void)
{
return (float)rand()/((float)RAND_MAX+1.0f);
}