Editorial for Hardcore Grinding


Remember to use this editorial only when stuck, and not to copy-paste code from it. Please be respectful to the problem author and editorialist.
Submitting an official solution before solving the problem yourself is a bannable offence.

Author: magicalsoup

This is a simpler version of the classic task-scheduling problem.

The original problem is very similar: given N tasks, each with a start time s_i and a finish time f_i, what is the minimum number of machines needed to complete all N tasks, if each machine can only work on one task at a time?

The solution is to first sort the tasks by their start time, which this problem has already done for us. Then we just need to find the point in time with the most overlapping tasks.

To find the most overlaps, think of each interval [s_i, f_i) as adding 1 (one task) at every point of time it covers. This is exactly the difference array range-update problem: we keep track of how many overlaps we have by adding one at diff[s_i] and subtracting one at diff[f_i]. Notice that it is f_i and not f_i + 1, because a task only occupies the half-open range [s_i, f_i). Then simply run a prefix sum over the difference array; the maximum value reached is the maximum number of overlaps, which is the minimum number of machines needed.
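A minimal sketch of this approach in C++. The input format (N followed by N pairs s_i f_i) and the bound MAX_T on the finish times are assumptions here; adjust them to the actual problem constraints.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;

    const int MAX_T = 1000000;      // assumed upper bound on finish times
    vector<int> diff(MAX_T + 1, 0); // difference array over the timeline

    for (int i = 0; i < n; i++) {
        int s, f;                   // task occupies the half-open interval [s, f)
        cin >> s >> f;
        diff[s] += 1;               // one more task running from time s...
        diff[f] -= 1;               // ...until time f (note: f, not f + 1)
    }

    // Prefix sum recovers the number of concurrent tasks at each time;
    // the maximum over the timeline is the minimum number of machines.
    int cur = 0, best = 0;
    for (int t = 0; t <= MAX_T; t++) {
        cur += diff[t];
        best = max(best, cur);
    }

    cout << best << '\n';
}
```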

Also keep in mind that the values s_i and f_i are quite small, so we can store the whole timeline in a difference array. If they went up to 10^9, we would need coordinate compression to avoid MLE.
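For the large-coordinate case, one alternative with the same effect as coordinate compression is to sort the 2N endpoint events and sweep over them, using \mathcal O(N) memory and \mathcal O(N \log N) time regardless of how large the times are. This event-sweep variant is an illustration, not part of the official solution; it assumes the same input format as above.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;

    // One event per endpoint: +1 when a task starts, -1 when it ends.
    vector<pair<int, int>> events;
    events.reserve(2 * n);
    for (int i = 0; i < n; i++) {
        int s, f;
        cin >> s >> f;
        events.push_back({s, +1});
        events.push_back({f, -1});
    }

    // Sort by time; at equal times the -1 sorts first, so a task ending
    // at time t frees its machine for a task starting at t (intervals
    // are half-open: [s, f)).
    sort(events.begin(), events.end());

    int cur = 0, best = 0;
    for (auto [t, delta] : events) {
        cur += delta;
        best = max(best, cur);
    }

    cout << best << '\n';
}
```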

Time Complexity: \mathcal O(N + T), where T is the largest finish time (the prefix sum pass scans the whole timeline).

