Algorithms in Go: Sliding Window Pattern

Let's consider the following problem: given an array of integers, we need to find the length of the smallest subarray whose sum is no less than a target number. If no such subarray exists, we return -1.

We can start with a naive approach and consider every possible subarray in the input:

  1. start with an item at index 0.

  2. grow the subarray by one.

  3. if the accumulated sum is greater than or equal to the target number, save the length of the subarray.

  4. continue growing the subarray until we reach the last item.

  5. move the start of the subarray to the next item at index 1 and repeat the procedure.

Let's encode the above algorithm in Go:

for start := 0; start < len(arr); start++ {
	var acc int
	for stop := start; stop < len(arr); stop++ {
		acc += arr[stop]
		if acc >= target {
			minLn = min(minLn, stop-start+1)
		}
	}
}

As Go doesn't have generics (yet), we have to write the function min manually.

What about the initial value of minLn? We would like to start with a value equal to plus infinity. However, taking into consideration that we are still mere humans, we have to settle for some finite approximation of infinity. An interesting fact about Go is that we can actually have constants that are "arbitrarily" big; for example, this

const inf = 1 << 65 // 2 to the power of 65

and even this:

const inf = 1 << 165 // 2 to the power of 165

are perfectly fine constants.

In our use case, however, we will be just fine with the maximum value for an int64. Any of these will do:

minLn := 1 << 63 - 1
minLn := math.MaxInt64

In the following articles, we will discuss in more depth the subtle differences in Go between integer constants and variables, and why we put the word "arbitrarily" in quotes in the paragraph above.

We also have to check whether we were able to accumulate the required sum at all and, if not, return -1:

if minLn == 1<<63-1 {
	return -1
}
return minLn

A clear advantage of the above implementation is that we visibly cover the whole space of options and can be certain that we get the intended result. The cost of this benefit is time complexity: we have two nested for loops, so we end up with quadratic time complexity, O(n²).

Can we do better than that?

Oh, yes. The Sliding Window pattern can work wonders here. The basic idea is the following: we start "consuming" the array and accumulating the sum. We stop when the accumulated sum is greater than or equal to the target number. Then we try to shrink the subarray from the start to obtain the minimum length of a subarray at this stage. Then we move the end of the subarray to the right (i.e., we process the next number in the array) and repeat the whole process.

We can encode the algorithm in Go:

var start, stop, acc int
minLn := 1<<63 - 1

for stop < len(arr) {
	for stop < len(arr) && acc < target {
		acc += arr[stop]
		stop++
	}
	if acc < target {
		break
	}

	for start <= stop && acc-arr[start] >= target {
		acc -= arr[start]
		start++
	}

	ln := stop - start
	minLn = min(ln, minLn)

	acc -= arr[start]
	start++
}

if minLn == 1<<63-1 {
   return -1
}
return minLn

OK, did we manage to improve the time complexity? Well, we iterate through the whole array, and we add each item (and later discard it) exactly once, so now we have linear time complexity, O(n).

Here you can find the code and tests.
