I.e. people who look at f(now) and assume it'll stay like this forever, versus people who look at f'(now) and assume it'll keep improving at this rate forever
What is f''(now) looking like?
Very small. From what I can tell, there's very little fundamental research into AI itself compared to the amount of work being poured into neural networks
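The f/f'/f'' framing above is just Taylor extrapolation at different orders. A toy sketch, with entirely made-up numbers (not a claim about actual AI progress):

```python
# Extrapolate a progress curve f from its value, slope, and curvature "now".
# All inputs here are illustrative placeholders, not real measurements.

def extrapolate(f_now, df_now, ddf_now, t):
    """Taylor expansions of f(now + t) at increasing order."""
    zeroth = f_now                                        # "it'll stay like this forever"
    first = f_now + df_now * t                            # "it'll improve at this rate forever"
    second = f_now + df_now * t + 0.5 * ddf_now * t ** 2  # curvature changes the picture
    return zeroth, first, second

print(extrapolate(10.0, 2.0, 0.1, 5.0))  # → (10.0, 20.0, 21.25)
```

The point of the comment is that the two camps disagree about which order of extrapolation is honest, while the second-order term, the one that would settle it, is the one nobody is measuring.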