| ▲ | salawat 6 days ago |
You build the tool, you're ultimately culpable. I've made it a rule in my life to conduct myself as if I will be held to account for everything I build, and its externalities. Helps keep my nose cleaner. Still managed to work on some things that keep me up at night, though.
|
| ▲ | brabel 6 days ago | parent | next [-] |
That's absolutely not how it works. Every license has a clause explicitly stating that the user is responsible for what they do with the tool. That's just common sense; if it worked the way you suggest, no one would create tools for others anymore. If you buy the screwdriver I sold and kill someone with it, I sure as hell have a clean conscience. The ChatGPT case is different because the "tool" has the capacity to interact with and potentially manipulate people psychologically, which is the only reason it's not a clear-cut case.
|
| ▲ | Pedro_Ribeiro 6 days ago | parent | prev | next [-] |
| That's a slippery slope! By that logic, you could argue that the creators of Tor, torrenting, Ethereum, and Tornado Cash should be held accountable for the countless vile crimes committed using their technology. |

| ▲ | RyanHamilton 6 days ago | parent [-] |
Legally, I think not being responsible is the right decision. Morally, I would hope everyone considers whether they themselves are even partially responsible. As I look around at young people today, I see the tablet-holding, media-consuming youth that programmers have created in order to get rich via advertising. I wish morals were considered more often.
| ▲ | salawat 6 days ago | parent [-] |
This, right here, is why I take the stance I do. Too many ethical blank checks get written if you don't keep the moral stain in place. If you make a surveillance tool, release it to a world that didn't have that capacity, and a dictator picks it up and rolls with it, that license of yours may absolve you in a court of law, but in the eyes of Root, you birthed it. You made the problem tractable. Not all problems were meant to be so. I used to not care about this as much, but the last decade has sharply changed my views. It may very well be a lesson only learned with sufficient time and experience. I made my choice, though. There are things I will not make, or make easier. I will not be complicit in knowingly forging the bindings of the future. Maybe if we mature as a society someday, but that day is not today.
|
| ▲ | novok 6 days ago | parent | prev [-] |
| So if you build a chair and then someone uses it to murder someone, are you responsible for the murder? |

| ▲ | salawat 4 days ago | parent | next [-] |
I am not willing to grant a software/AI dev the same level of exculpation on the basis of reasonable foreseeability that I grant a carpenter making a chair. Our trade has a history of finding great joy in using things in ways they were never intended, which, far more than for a carpenter, puts a burden on us to take the propensity for and possibility of abuse into account. So no: Mr. Altman, and the people who made this chair, are in part responsible. You aren't a carpenter. You had a responsibility to the public to constrain this thing and to stay as far ahead of it as humanly possible. Every AI-as-therapist startup I've seen in the past couple of years, even passion projects from juniors I've trained, has been met with the same guiding wisdom: go no further. You are critically out of your depth, and you are creating a clear and evident danger to the public whose risks you are not yet equipped to mitigate. If I can get there, it's pretty damn obvious.

| ▲ | mothballed 6 days ago | parent | prev [-] |
You're not even responsible if you build an AR-15, complete with a bunch of free advertisements from the US Army using the select-fire variant to slaughter innocent Iraqis, and it's used to murder someone. The government will require them to add age controls, and that will be that.
|