Entropy, when interpreted as “disorder”, is generally bad. Entropy naturally tends to increase, by the second law of thermodynamics. Over time, your room will get messy, and you must invest energy to clean it up. Your refrigerator keeps cold air and warm air separate. But without a power source, that “organization” is lost, and everything inside and out becomes uniformly room temperature.
However, I think there is an interesting duality to the word—whether entropy is good or bad depends on your frame of reference.
In maintaining a software codebase, it is desirable to eliminate redundancy, as captured by the DRY principle. Duplicate data should be consolidated by database normalization. And when code is considered as data, we may deduplicate it by refactoring.
In an information-theoretic sense, this is actually maximizing entropy. Normalizing data, stripping redundancy—these serve to maximize the amount of information gained per character for a developer browsing the code. One may speak of improving code quality or hygiene in this way as “organizing” the codebase. And now there seems to be a paradoxical relationship between the concepts of “organization” and “entropy”: organizing one’s room is desirable, but decreases entropy. Organizing the code is desirable, too, but increases entropy.
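The claim can be made concrete with character-level Shannon entropy, a crude but illustrative proxy for information per character. The sketch below (function name and sample strings are my own, not from any particular library) shows that a highly repetitive string carries far fewer bits per character than one with no repeated structure:

```python
from collections import Counter
from math import log2

def entropy_per_char(text: str) -> float:
    """Shannon entropy in bits per character, from character frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A redundant string: most characters tell you nothing new.
redundant = "aaaaaaaaab"
# A deduplicated string of the same length: every character is informative.
dense = "abcdefghij"

print(entropy_per_char(redundant))  # ≈ 0.469 bits/char
print(entropy_per_char(dense))      # = log2(10) ≈ 3.322 bits/char
```

In this sense, a DRY codebase is like the second string: having removed repetition, each character a reader scans yields more information.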
I haven’t yet been able to articulate the underlying rule that unifies the thermodynamic and information-theoretic senses of the word. Someone else probably has. For now, “entropy warrior” is sufficiently ambiguous a term that I can claim to fight for either side.