mgdev 16 hours ago

This is an economically sound conclusion.

It also means that you need to extract enough value to cover the cost of said tokens, or reduce the economic benefit of finding exploits.

Reducing economic benefit largely comes down to reducing distribution (breadth) and reducing system privilege (depth).

One way to reduce distribution is to raise the price.

Another is to make a worse product.

Naturally, less valuable software is not a desirable outcome. So either you reduce the cost of keeping open (by making closed), or increase the price to cover the cost of keeping open (which, again, also decreases distribution).

The economics of software are going to massively reconfigure in the coming years, open source most of all.

I suspect we'll see more 'open spec' software, with actual source generated on-demand (or near to it) by models. Then all the security and governance will happen at the model layer.
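The 'open spec' pipeline described above could be sketched as a build step: spec in, freshly generated source out, with an audit record attached so governance can hang off the generation layer. This is a minimal illustration, not any real tool; `generate_source` is a hypothetical stand-in for a model API call, stubbed here so the example runs.

```python
import hashlib

def generate_source(spec: str) -> str:
    """Hypothetical stand-in for a code-generation model call.
    In the 'open spec' world this would hit a model endpoint;
    here it is a deterministic stub so the pipeline is runnable."""
    return f"# generated from spec\nprint({spec!r})\n"

def build(spec: str) -> dict:
    """On-demand build: the spec is the distributed artifact, the
    source is ephemeral, and the hash gives governance something
    stable to audit at the model layer."""
    source = generate_source(spec)
    return {
        "source": source,
        "audit": hashlib.sha256(source.encode()).hexdigest(),
    }

artifact = build("hello world")
```

Note that with a non-deterministic model, the audit hash would differ on every build, which is exactly the property the replies below object to.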

cassianoleal 14 hours ago | parent | next [-]

> I suspect we'll see more 'open spec' software, with actual source generated on-demand (or near to it) by models. Then all the security and governance will happen at the model layer.

So each time you roll the dice you gamble on getting a fresh set of 0-days? I don't get why anyone would want this.

mgdev 14 hours ago | parent [-]

You already do this with human-authored code, just slowly.

Project model capabilities out a few years. Even if you only assume linear improvement, at some point the risk-adjusted outcome lines cross each other and this becomes the preferred way of authoring code: code nobody but you ever sees.

Most enterprises already HATE adopting open source. They only do it because the economic benefit of free reuse has traditionally outweighed the risks.

If you need a parallel: we already do this today for JIT compilers. Everything is just getting pushed down a layer.

xigoi an hour ago | parent | prev [-]

I love using software that changes every time you compile it.