Google AI Ships TimesFM-2.5: Smaller, Longer-Context Foundation Model That Now Leads GIFT-Eval (Zero-Shot Forecasting)
Google Research has released TimesFM-2.5, a 200M-parameter, decoder-only time-series foundation model with a 16K context length and native probabilistic forecasting support. The new checkpoint is live on Hugging Face. On GIFT-Eval, TimesFM-2.5 now tops the leaderboard across accuracy metrics (MASE, CRPS) among zero-shot foundation models.

What is Time-Series Forecasting?
Time-series forecasting is the practice of…
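For reference, the two accuracy metrics cited on the GIFT-Eval leaderboard above can be computed roughly as in the minimal NumPy sketch below: MASE scales forecast error by a seasonal-naive baseline on the training series, and CRPS is approximated here from quantile forecasts via the pinball loss. Function names, the seasonal period, and the toy data are illustrative assumptions, not the benchmark's official implementation.

```python
import numpy as np

def mase(y_true, y_pred, y_train, season=1):
    """Mean Absolute Scaled Error: forecast MAE divided by the in-sample
    MAE of a seasonal-naive forecast on the training series."""
    forecast_mae = np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))
    naive_mae = np.mean(np.abs(y_train[season:] - y_train[:-season]))
    return forecast_mae / naive_mae

def crps_from_quantiles(y_true, quantile_preds, quantile_levels):
    """Quantile-based CRPS approximation: twice the pinball (quantile)
    loss averaged over the predicted quantile levels and the horizon."""
    y_true = np.asarray(y_true)[None, :]              # shape (1, horizon)
    q = np.asarray(quantile_levels)[:, None]          # shape (n_quantiles, 1)
    preds = np.asarray(quantile_preds)                # shape (n_quantiles, horizon)
    diff = y_true - preds
    pinball = np.maximum(q * diff, (q - 1.0) * diff)  # pinball loss per quantile
    return 2.0 * pinball.mean()

# Toy usage (hypothetical data): score a flat point forecast and a small quantile fan.
y_train = np.sin(np.arange(100) / 5.0)
y_true = np.sin(np.arange(100, 112) / 5.0)
y_pred = np.full(12, y_train[-1])
levels = np.array([0.1, 0.5, 0.9])
quantile_preds = np.stack([y_pred - 0.2, y_pred, y_pred + 0.2])
print("MASE:", mase(y_true, y_pred, y_train))
print("CRPS:", crps_from_quantiles(y_true, quantile_preds, levels))
```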
