It's understanding impact before taking action, especially on issues at scale (like labor-displacing technologies). It's being vocal about shaping society in ways that reduce the harms that naturally follow from inventions and technological revolutions, instead of pawning the problem off on "other experts" or on the workers themselves. It's engaging stakeholders beyond your comfort zone and social circles to build consensus, shape movements, and mitigate the damage done while amplifying the good that could come from such a profound change. It's slowing or stopping work if society refuses to adapt, or directing your output against those blocking the transformations needed to protect ordinary people and guarantee that prosperity is shared rather than hoarded. It's ostracizing those who pursue such selfish ends while remaining willfully ignorant of (or worse, deliberately working toward) the harms they'll cause others, applying negative pressure to drive positive reforms, or warning others of the harm they'll incur by engaging in such socially hostile behavior.
At its core, it's about understanding that you are, metaphorically, not a single person but a cog in a much larger machine, and that your actions reverberate through that machine in ways that are largely predictable, at least for a reasonable number of next-order impacts. Setting aside emotional intelligence like empathy and compassion for a moment: the practical intelligence needed to solve problems at a scale where robots or machines can displace labor implies a similar capacity to understand the harms and impacts of those solutions on the populace. To focus solely on solving the problem in front of you, rather than acknowledging the impact it will have beyond you, is to willfully reject accountability in favor of achievement, and we have enough vainglorious chuds in technology as it is.