This might not prevail in the world of tech, but in language studies, words mean what the majority of their users think they mean. Examples:
* Decimated. How many of you know this means (or once meant) reduced by 1/10?
* Literally. Often used to mean figuratively, to the degree that it can be relied on to mean nothing at all.
* Reign, as in "reign him in". Now apparently an accepted misuse, "reign" once described what a monarch does to a kingdom, not what a cowboy does to a horse (that would be "rein").
* Fewer / less. Sadly interchangeable in modern writing, "fewer" was once reserved for enumerable things, while "less" referred to continuous measures. Less water, fewer liters of water.
* Double precision. In computer science, defined in IEEE 754 as a floating-point data format with a 53-bit mantissa, therefore 15.95 decimal digits (53 * log(2)/log(10)). Now the norm, the default, to the degree that people may forget what "double" refers to. Because of double's ubiquity, in the fullness of time I expect single precision will come to be known as ... wait for it ... half precision.
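The digit count in that last bullet can be checked with a few lines of Python. The significand widths below include the implicit leading bit, per IEEE 754 (a quick sanity-check sketch, not part of the original post):

```python
import math

# IEEE 754 significand widths, including the implicit leading bit:
# half (binary16), single (binary32), double (binary64).
significand_bits = {"half": 11, "single": 24, "double": 53}

for name, bits in significand_bits.items():
    # decimal digits of precision = bits * log10(2)
    digits = bits * math.log10(2)
    print(f"{name}: {bits} bits ~ {digits:.2f} decimal digits")
```

For double, 53 × log10(2) ≈ 15.95, matching the figure above; single works out to about 7.22 digits.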
Lexicographers are at pains to point out that words mean what people think they mean. I think they have a point.