simonw | a day ago
I finally got around to trying this out just now. Here's how to run it using uvx (so you don't need to install anything first):
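Assuming the usual PyPI package name (open-interpreter) and its interpreter entry point, the uvx invocation is roughly:

    uvx --from open-interpreter interpreter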
I took the simplest route and pasted in an OpenAI API key, then typed my question (the exact prompt is in the transcript linked below).
It generated a couple of chunks of Python, asked my permission to run them, ran them and gave me a good answer. Here's the transcript: https://gist.github.com/simonw/f78a2ebd2e06b821192ec91963995...
swyx | 20 hours ago | parent
Simon's writeup is here: https://simonwillison.net/2024/Nov/24/open-interpreter/

I always thought the potential for Open Interpreter was as a kind of "open source ChatGPT desktop assistant" app with swappable LLMs, especially for vision, since that (specifically the assistant teased at GPT-4o's launch: https://www.youtube.com/watch?v=yJHw33cVeHo) has not yet been released by OpenAI. They made some headway with the "01" device they teased... and then canceled it.

Instead, all the demo use cases seem very trivial: "Plot AAPL and META's normalized stock prices". "Add subtitles to all videos in /videos" seems a bit more interesting, but honestly, trying to hack it in a "code interpreter" inline in a terminal is strictly worse than just opening up Cursor for me.

I'd be interested to hear from anyone here who is an active user of OI, and what you use it for.
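To show why that first demo feels trivial, here is roughly what "Plot AAPL and META's normalized stock prices" boils down to as plain Python. This is a sketch under assumptions: it uses the yfinance and matplotlib packages, which may or may not be what Open Interpreter actually reaches for.

    # Rough sketch of the "Plot AAPL and META's normalized stock prices" demo.
    # Assumes yfinance and matplotlib are installed; Open Interpreter's own
    # generated code may differ.
    import yfinance as yf
    import matplotlib.pyplot as plt

    # A year of daily closing prices for both tickers
    prices = yf.download(["AAPL", "META"], period="1y")["Close"]

    # Normalize each series to 1.0 at the start so they're comparable
    normalized = prices / prices.iloc[0]

    normalized.plot(title="AAPL vs META, normalized")
    plt.show()

A dozen lines, most of it boilerplate, which is the kind of thing any code-capable chat model already handles without the agentic wrapper.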