
Maximum Good Subarray Sum

Link

Solution

See reference (Chinese).

Prefix sum:

\[ s\left[ 0 \right] =0 \]
\[ s\left[ 1 \right] =nums\left[ 0 \right] \]
\[ s\left[ 2 \right] =nums\left[ 0 \right] +nums\left[ 1 \right] \]
\[ \vdots \]
\[ s\left[ i \right] =nums\left[ 0 \right] +\cdots +nums\left[ i-1 \right] \]
\[ s\left[ i+1 \right] =nums\left[ 0 \right] +\cdots +nums\left[ i \right] =\sum_{j=0}^i{nums\left[ j \right]} \]
\[ \sum_{k=l}^r{nums\left[ k \right]}=\left( nums\left[ 0 \right] +\cdots +nums\left[ r \right] \right) -\left( nums\left[ 0 \right] +\cdots +nums\left[ l-1 \right] \right) =s\left[ r+1 \right] -s\left[ l \right] \]
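As a concrete check (the array here is chosen only for illustration, it is not from the problem statement):

\[ nums=\left[ 3,-1,2 \right] \ \Rightarrow \ s=\left[ 0,3,2,4 \right] ,\qquad nums\left[ 1 \right] +nums\left[ 2 \right] =-1+2=1=s\left[ 3 \right] -s\left[ 1 \right] \]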

We want to find:

\[ \max \left( s\left[ j+1 \right] -s\left[ i \right] \right) ,\quad \text{s.t.}\ \left| nums\left[ i \right] -nums\left[ j \right] \right|=k,\ i\le j \]

By iterating over j from left to right, for each j we need the minimum value of s[i] among all i <= j with nums[i] == nums[j] - k or nums[i] == nums[j] + k.
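In other words, since s[j + 1] is fixed once j is fixed, maximizing s[j + 1] - s[i] over the admissible i is the same as subtracting the smallest admissible s[i]:

\[ \max_{\substack{i\le j\\ nums\left[ i \right] \in \left\{ nums\left[ j \right] -k,\, nums\left[ j \right] +k \right\}}}\left( s\left[ j+1 \right] -s\left[ i \right] \right) =s\left[ j+1 \right] -\min_{\substack{i\le j\\ nums\left[ i \right] \in \left\{ nums\left[ j \right] -k,\, nums\left[ j \right] +k \right\}}}s\left[ i \right] \]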

Define a hash map, where the key is nums[i], and the value is the minimum value of s[i] for the same nums[i].

Note: Precomputing the full prefix-sum array first and then iterating again results in TLE here; instead, maintain the running prefix sum on the fly within the single pass, as in the code below.

#include <climits>
#include <unordered_map>
#include <vector>
using namespace std;

class Solution {
public:
    long long maximumSubarraySum(vector<int>& nums, int k) {
        long long res = LLONG_MIN, sum = 0;  // sum is the prefix sum s[j] before the current element
        unordered_map<int, long long> minS;  // minS[v] = minimum s[i] over all seen i with nums[i] == v

        for (int x : nums) {
            // x plays the role of nums[j]; a valid left endpoint i has nums[i] == x + k or x - k.
            auto it = minS.find(x + k);
            if (it != minS.end()) {
                res = max(res, sum + x - it->second);  // s[j + 1] - s[i]
            }

            it = minS.find(x - k);
            if (it != minS.end()) {
                res = max(res, sum + x - it->second);
            }

            // Record s[j] for the value nums[j], keeping only the minimum per value.
            it = minS.find(x);
            if (it == minS.end() || sum < it->second) {
                minS[x] = sum;
            }
            sum += x;  // now sum == s[j + 1]
        }
        return res == LLONG_MIN ? 0 : res;  // no good subarray exists
    }
};
  • Time complexity: \(O(n)\), where \(n\) is the length of nums;
  • Space complexity: \(O(n)\).
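For a quick local sanity check (not part of the original reference solution), a minimal driver might look like the following. It assumes the headers and the Solution class above; the input [3, 2, 4, 1] with k = 1 is hand-traced, and its best good subarray is [3, 2, 4] with sum 9.

#include <iostream>

int main() {
    vector<int> nums = {3, 2, 4, 1};
    Solution sol;
    // Good subarrays for k = 1: [3, 2] (sum 5), [3, 2, 4] (sum 9), [2, 4, 1] (sum 7).
    cout << sol.maximumSubarraySum(nums, 1) << '\n';  // expected output: 9
    return 0;
}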

Note: Python solution for reference:

from collections import defaultdict
from math import inf
from typing import List


class Solution:
    def maximumSubarraySum(self, nums: List[int], k: int) -> int:
        ans = -inf
        # min_s[v] = minimum prefix sum s[i] over all seen i with nums[i] == v
        min_s = defaultdict(lambda: inf)
        s = 0  # running prefix sum s[j] before the current element
        for x in nums:
            # x plays the role of nums[j]; valid left endpoints hold the value x - k or x + k
            ans = max(ans, s + x - min(min_s[x - k], min_s[x + k]))
            min_s[x] = min(min_s[x], s)
            s += x
        return ans if ans > -inf else 0  # no good subarray exists