ngrislain | 6 months ago
Same token usage. OpenAI already returns the logprob of each token, conditional on the previous ones, when the `logprobs=true` option is set. This lib simply parses the output JSON string with `lark` into an AST whose value nodes are mapped back to character ranges in the JSON string. The characters are then mapped back to the GPT tokens overlapping those ranges, and the logprobs of the overlapping tokens are summed.
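A minimal sketch of that last step, not the library's actual code: it assumes the returned tokens concatenate exactly to the output string (so character offsets can be reconstructed by walking the token list), and it represents each token as a plain `{"token": ..., "logprob": ...}` dict for illustration. Getting the value node's character range out of the `lark` AST is elided here.

```python
def token_char_ranges(tokens):
    """Map each returned token to its (start, end) character range in the output string."""
    ranges, pos = [], 0
    for t in tokens:
        start, end = pos, pos + len(t["token"])
        ranges.append((start, end))
        pos = end
    return ranges

def value_logprob(value_start, value_end, tokens):
    """Sum the logprobs of every token whose character range overlaps the value's range."""
    total = 0.0
    for (start, end), t in zip(token_char_ranges(tokens), tokens):
        if start < value_end and end > value_start:  # half-open ranges overlap
            total += t["logprob"]
    return total

# Toy example: output string '{"a": 42}' with hypothetical tokens and logprobs.
tokens = [
    {"token": '{"', "logprob": -0.01},
    {"token": 'a',  "logprob": -0.02},
    {"token": '":', "logprob": -0.01},
    {"token": ' 42', "logprob": -0.50},
    {"token": '}',  "logprob": -0.01},
]
# The value 42 spans characters 6..8, so only the ' 42' token overlaps.
print(value_logprob(6, 8, tokens))  # -0.5
```

Note that a token can straddle a value boundary (like ' 42' above, which also covers the preceding space), so summing over overlapping tokens gives an approximation of the value's logprob rather than an exact decomposition.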
potatoman22 | 6 months ago | parent
That's great to hear, thanks for the explanation! Super excited to try this out. |