norir 2 days ago

Not only do the instruments require practice to sound good (I've been playing electric bass for three years and am just beginning to sound better than bad), but a huge part of the process is learning to listen to the instrument and make adjustments. The beauty is that you can immediately hear the result of each adjustment. If it sounds better, you keep it; otherwise you keep adjusting until you get closer to what you're looking for. With a prompt-based AI tool, it is not possible to make low-latency adjustments. Even if you could, how would you articulate the subtle adjustment to the LLM?

My sense is that, contrary to the marketing, AI tools will be most useful to people who already have musical skill, and will actively subvert musical development in most people who rely on them too early in their process.