# Systematic Resampling Algorithm

So here I’m showing you the actual wagon-wheel algorithm that we just talked about: systematic resampling. You start off with no new samples, and you build up the c’s, which represent the outer ring, the cumulative distribution function. Essentially, each c is the one before it plus the next weight. Right?
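In code, building that outer ring is just a running sum of the weights. A minimal sketch (the weight values here are made up for illustration):

```python
# Outer ring: cumulative sums of the normalized weights.
weights = [0.1, 0.25, 0.15, 0.3, 0.2]  # example weights; they sum to one

c = []
running = 0.0
for w in weights:
    running += w       # each c is the one before it plus the next weight
    c.append(running)

# c climbs from the first weight up to 1.0 (to floating-point precision),
# because the weights sum to one.
```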

So, you start off at the first weight: c1 is w1, then c2 is the sum of w1 and w2, and c3 is the sum of w1, w2, and w3. That’s that outer ring, okay? You can think of it this way: because everything sums to one, the ring goes from zero to one, and each c tells you what fraction of the way around the clock you’ve gone.

The next thing you do is pick the starting position. Remember, I told you that you have to randomly throw down the wagon wheel to start with. Well, if the wagon wheel has eight spokes, the first spoke has to go somewhere between zero and one-eighth. That’s what this is right here: n to the minus one, which, remember, you can also write as 1 over n. That’s the offset. And then what you do is you go around the outer ring asking, did I pass my offset yet? As soon as I’ve passed it, that tells me which part of the cdf I’m in. So, just drawing that on here: this is my first spoke, and this is where it starts on the outer ring. I ask, did I pass it yet? Nope. Okay, did I pass it yet? Yes, I did.

So this first spoke landed in w2. Then I just go to the next spoke and ask, did my last marker pass it? No, so I go to the next one. Did this one pass it? Yes, so my next sample comes from w3. That’s all I do: I initialize, and then I count up for however many samples I need, grabbing the next wagon-wheel chunk and seeing where on the cdf it lands. And then I just return that set. It looks more complicated than it is, and it’s a way of doing that sampling in linear time.
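Putting the whole walk-through together, here is a minimal sketch of that systematic resampling pass (the function name and example weights are mine, not from the lecture):

```python
import random

def systematic_resample(weights):
    """Draw len(weights) particle indices via the wagon-wheel scheme."""
    n = len(weights)

    # Outer ring: cumulative distribution, c[k] = w[0] + ... + w[k].
    c = []
    running = 0.0
    for w in weights:
        running += w
        c.append(running)

    # Throw down the wheel: the first spoke lands somewhere in [0, 1/n).
    u = random.uniform(0.0, 1.0 / n)

    indices = []
    i = 0
    for m in range(n):
        # Spoke m sits at u + m/n around the ring; advance the marker
        # until the cumulative weight passes it, then keep that index.
        spoke = u + m / n
        while c[i] < spoke:
            i += 1
        indices.append(i)
    return indices
```

Because every spoke is visited once and the marker `i` only ever moves forward through the ring, the whole pass touches each of the n positions at most once each, which is the linear-time claim at the end.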