Remember to use this editorial only when stuck, and not to copy-paste code from it. Please be respectful to the problem author and editorialist.
Submitting an official solution before solving the problem yourself is a bannable offence.
Author: magicalsoup
This is a simpler version of the classic task-scheduling problem.
The original problem is very similar: given $N$ tasks, each with a start time $s_i$ and a finish time $f_i$, what is the minimum number of machines needed to accomplish all $N$ tasks, if each machine can only do $1$ task at a time?
The solution is to first sort the tasks by start time, which this problem already provides. Then we just need to find the point in time with the most overlapping tasks.
To find the most overlaps, think of an interval $[s_i, f_i)$ as adding a $+1$ (task) throughout the interval. This resembles the classic difference array update problem. We can use a difference array to keep track of how many overlaps we have, by simply adding one at $s_i$ and subtracting one at $f_i$. Notice how it is $f_i$ and not $f_i + 1$; this is because a task only occupies the range $[s_i, f_i - 1]$, so a machine that finishes at time $f_i$ is free to start another task at $f_i$. Then simply run a prefix sum over the difference array and find its maximum value: that is the maximum number of overlaps, and therefore the minimum number of machines needed.
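As a concrete illustration, here is a minimal sketch of this approach in C++. It assumes the input is $N$ followed by $N$ pairs $s_i\ f_i$, and the bound `MAX_TIME` on the times is a placeholder, not taken from the problem statement; adjust both to the actual constraints.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // MAX_TIME is an assumed upper bound on all finish times.
    const int MAX_TIME = 1'000'000;
    int n;
    cin >> n;

    vector<int> diff(MAX_TIME + 1, 0);
    for (int i = 0; i < n; i++) {
        int s, f;
        cin >> s >> f;
        diff[s]++;  // a task starts occupying a machine at time s
        diff[f]--;  // and frees it at time f (it occupies [s, f - 1])
    }

    // Prefix sum turns the difference array into "tasks running at time t";
    // the maximum over all t is the minimum number of machines needed.
    int running = 0, answer = 0;
    for (int t = 0; t <= MAX_TIME; t++) {
        running += diff[t];
        answer = max(answer, running);
    }

    cout << answer << '\n';
    return 0;
}
```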
Also keep in mind that the values $s_i$ and $f_i$ are quite small here, so we can store the whole timeline in a difference array. If they could be much larger (say, up to $10^9$), we would need coordinate compression to avoid MLE.
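For that larger case, a sketch of the coordinate-compressed variant could look like the following. A sorted map acts as a sparse difference array, so only times that actually appear are stored; the input format is assumed to be the same as above.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;

    // Sparse difference array: memory is O(N) regardless of how large
    // the time values themselves are.
    map<long long, int> diff;
    for (int i = 0; i < n; i++) {
        long long s, f;
        cin >> s >> f;
        diff[s]++;  // task starts at s
        diff[f]--;  // task ends at f (occupies [s, f - 1])
    }

    // Sweep the events in increasing time order (prefix sum over the map).
    int running = 0, answer = 0;
    for (auto& [t, delta] : diff) {
        running += delta;
        answer = max(answer, running);
    }

    cout << answer << '\n';
    return 0;
}
```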
Time Complexity: $\mathcal{O}(N + \max_i f_i)$