Submit solution
Points:
15
Time limit:
1.0s
Memory limit:
256M
Problem types
Allowed languages
Ada, Assembly, Awk, Brain****, C, C#, C++, COBOL, CommonLisp, D, Dart, F#, Forth, Fortran, Go, Groovy, Haskell, Intercal, Java, JS, Kotlin, Lisp, Lua, Nim, ObjC, OCaml, Octave, Pascal, Perl, PHP, Pike, Prolog, Python, Racket, Ruby, Rust, Scala, Scheme, Sed, Swift, TCL, Text, Turing, VB, Zig
You have an array of integers, initialized all to zero to begin with. Support two operations.
INCREMENT l r a
- For each index between
and
, increase the
th element of the array by
.
SUM l r
- Compute the sum of the integers between indices and
, inclusive.
Constraints
Input Specification
The first line contains two positive integers, and
.
The next lines each contain a sequence of positive integers. If the first integer in the line is
, then three integers follow,
,
, and
, indicating an
INCREMENT
operation.
Otherwise, the first integer in the line is , and then two integers follow,
and
, indicating a
SUM
operation.
Output Specification
For each SUM
operation, output on its own line the result of the query. The data will be set such that this fits in a signed 32-bit integer.
Sample Input
3 4
1 2 3 2
2 1 1
2 2 2
2 3 3
Sample Output
0
2
4
Comments
Can anyone tell me why I'm TLEing on testcase 5? I know there's some sort of optimization but I don't know the specifics.
Your time complexity is currently
, which is insufficient given the constraints (this is on the order of
operations, give or take). If you are stumped, you can try reading the tutorial :)).