If it's deltatimed, how is it dependent on the framerate?
Lerp, given inputs a, b and t, will return a value v that is a fraction t of the way between them; if you fed it 0, 4 and 0.5 it would return 2, because 2 is the value 50% of the way from 0 to 4. A lot of people who are new to game development in Unity will use the frame delta as the t value, leading to an exponential slowdown as the value approaches the destination b - it looks correct, but it isn't, and the results will vary depending on framerate.
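For reference, this is what plain linear interpolation boils down to - a standalone C# sketch of the usual unclamped formula, not Unity's actual Mathf.Lerp source:

    using System;

    // Minimal sketch of the standard (unclamped) linear interpolation formula.
    float Lerp(float a, float b, float t) => a + (b - a) * t;

    // t is the fraction of the way from a to b, so 0.5 means halfway:
    Console.WriteLine(Lerp(0f, 4f, 0.5f));   // prints 2

    // The misuse described above, run once per frame:
    //     position = Lerp(position, target, deltaTime);
    // only covers deltaTime of the *remaining* distance each frame, so the
    // motion eases out exponentially instead of moving at a steady rate.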
If you are running at 100fps, the delta time will be equal to 0.01 and the function
Lerp(a, b, DeltaTime); will return a value that is 0.99a+0.01b.
Likewise, running the game at 50fps with a delta time of 0.02, the same function will return a value of 0.98a+0.02b.
This is not the proper way to make the motion framerate independent. For it to be framerate independent, one frame at 50fps would have to advance the value exactly as far as two frames at 100fps, but two frames at 100fps give Lerp(Lerp(a, b, 0.01), b, 0.01) = 0.9801a+0.0199b, while one frame at 50fps gives 0.98a+0.02b. The results are close but not equal, and the discrepancy grows as the framerate drops: at 10fps a single frame covers 10% of the remaining distance, while the equivalent ten frames at 100fps only cover about 9.6% of it, so the motion plays out differently depending on framerate.
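To see the divergence without taking the algebra on faith, here is a small standalone C# sketch (made-up framerates and a made-up target, not anything lifted from the game's code) that runs the same simulated second at different framerates:

    using System;

    float Lerp(float a, float b, float t) => a + (b - a) * t;

    // Step value = Lerp(value, target, deltaTime) once per frame, for one
    // simulated second at each framerate - the misuse described above.
    foreach (int fps in new[] { 100, 50, 10 })
    {
        float value = 0f, target = 1f;
        float deltaTime = 1f / fps;
        for (int frame = 0; frame < fps; frame++)   // fps frames = 1 simulated second
            value = Lerp(value, target, deltaTime);
        Console.WriteLine($"{fps,3} fps -> value after 1 second: {value:F4}");
    }
    // Prints roughly 0.6340, 0.6358 and 0.6513 - if this were framerate
    // independent, all three runs would land on the same value.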
Furthermore, using the Lerp function as described above will never actually make your value equal to b, because each frame only closes a fraction (the frame delta) of the remaining gap - unless your game is running so slowly that the frame delta is >= 1, which will most likely never happen, even in Yandere Simulator's case.
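The same kind of standalone sketch (again with made-up numbers) makes that concrete - each frame leaves (1 - deltaTime) of the gap intact, so the gap keeps shrinking but never closes:

    using System;

    double Lerp(double a, double b, double t) => a + (b - a) * t;

    // Ten simulated seconds at 100 fps, using the same per-frame misuse.
    double value = 0.0, target = 1.0, deltaTime = 0.01;
    for (int frame = 0; frame < 100 * 10; frame++)
        value = Lerp(value, target, deltaTime);
    Console.WriteLine($"gap to target after 10 seconds: {target - value:E2}");
    // Prints roughly 4.32E-005 - the gap keeps shrinking but never reaches zero.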
There's a plethora of articles and forum posts about this online, all of them aimed at beginner developers - Alex obviously didn't investigate this either.