dash2 | 9 hours ago
So the time series are provided with no context? It's just trained on lots of sets of numbers? Then you give it a new set of numbers and it guesses the rest, again with no context?

My guess as to how this would work: the machine will first guess from the data alone whether this is one of the categories it has already seen or inferred (share prices, Google Trends cat searches, etc.). Then it'll output a plausible completion for that category. That doesn't seem as if it would work well for any category outside the training data. I would rather just use either a simple model (ARIMA or whatever) or a theoretically informed model. But what do I know.
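For comparison, the "simple model" route the comment mentions really is simple: here's a minimal sketch of an AR(1) fit by least squares, which predicts the rest of a series from nothing but the numbers themselves. (This is an illustrative toy, not from the thread; in practice you'd reach for something like statsmodels' ARIMA.)

```python
def fit_ar1(series):
    """Estimate c and phi for x[t] = c + phi * x[t-1] + noise, by least squares."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted recurrence forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# A geometrically decaying series: AR(1) recovers phi = 0.8 exactly here.
data = [100 * (0.8 ** t) for t in range(20)]
c, phi = fit_ar1(data)
print(round(phi, 3))  # → 0.8
```

The point of the comparison: a model like this has a handful of parameters and a clear failure mode, whereas the foundation-model approach bets that patterns learned across many unrelated series transfer to yours.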
Tarq0n | 8 hours ago | parent
If it works for predicting the next token in a very long stream of tokens, why not here? The open question is what architecture and training regimen it would need in order to generalize.