Copyright © University of Cambridge. All rights reserved.
'Ramping it Up' printed from https://nrich.maths.org/
The gradient is infinitely steep at the points where there is a
step in value. The arrows mark Dirac delta functions: they are
thought of as infinitely high and infinitely thin, with a notional
area equal to the size of the step jump. Thus when you integrate
this derivative, you recover the original step function.
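A discrete sketch makes this concrete: differencing a sampled unit step produces a single tall, narrow spike of area one (a crude stand-in for a delta function), and summing that spike back up recovers the step. The grid and step below are illustrative choices, not part of the original problem.

```python
import numpy as np

# Sample a unit step on a fine grid (illustrative only -- the true
# delta function is a limit, not an array).
x = np.linspace(-1.0, 1.0, 201)
dx = x[1] - x[0]
step = np.where(x >= 0, 1.0, 0.0)   # jump of size 1 at x = 0

# Finite-difference "derivative": one tall, thin spike of area ~1.
deriv = np.diff(step) / dx
print(deriv.max() * dx)             # spike height times width = 1.0

# Integrating (cumulative sum) recovers the original step.
recovered = np.cumsum(deriv) * dx
print(np.allclose(recovered, step[1:]))  # True
```

As the grid spacing shrinks, the spike grows taller and thinner while its area stays fixed at the step size, which is exactly the limiting picture of a delta function.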
A system's response to an impulse (think of a sudden "bang" to,
say, a car's suspension) is very useful: it can be used, in a
process called "convolution", to find the system's response to
any input.
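The idea above can be sketched numerically. The decaying-exponential impulse response below is a hypothetical stand-in for something like a damped suspension; the specific decay rate and input signal are illustrative assumptions.

```python
import numpy as np

n = np.arange(50)
h = np.exp(-0.2 * n)          # assumed impulse response: decaying exponential

# Feeding in a unit impulse reproduces h itself...
impulse = np.zeros(50)
impulse[0] = 1.0
print(np.allclose(np.convolve(impulse, h)[:50], h))  # True

# ...and the response to ANY input is the input convolved with h.
u = np.sin(0.3 * n)           # an arbitrary input signal
y = np.convolve(u, h)[:50]    # system output, truncated to the input length
```

This works because any input can be viewed as a train of scaled, shifted impulses, and a linear system's output is just the sum of the correspondingly scaled, shifted impulse responses.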