rootnod3 | 3 days ago
The longer the context and the discussion go on, the more confused it can get, especially if you have to keep refining the conversation or the code you're building on. Remember, at its core it's basically a text-prediction engine, so the more varied the context, the more likely it is to make a mess of it. Too short a context: the earlier conversation falls out of the window and it loses track. Too long a context: all that accumulated text can mess with the model. So the trick is to strike a balance. But if it's an online model, you have fuck all to control. If it's a local model, you have some say in the parameters.
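That balancing act can be sketched as a simple trimming policy: drop the oldest turns until the transcript fits a fixed budget. A minimal sketch, with word counts standing in for tokens (a real setup would use the model's tokenizer), and `trim_history` being a made-up helper, not any library's API:

```python
# Hypothetical sketch: keep a chat history within a fixed context budget
# by dropping the oldest turns first. Word count approximates token count.

def trim_history(messages, budget):
    """Keep the most recent messages whose combined word count fits
    within budget. messages is a list of strings, oldest first."""
    kept = []
    used = 0
    # Walk from newest to oldest, keeping turns until the budget is spent.
    for msg in reversed(messages):
        cost = len(msg.split())
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    # Restore oldest-first order for the surviving turns.
    return list(reversed(kept))

history = [
    "user: explain context windows",
    "assistant: a context window is the text the model can see at once",
    "user: so what happens when it overflows",
]
print(trim_history(history, budget=20))
```

With a local model you'd pair something like this with the actual context-size parameter (e.g. whatever your runtime exposes for context length); with a hosted model the provider does the truncation for you, on its own terms.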