Entropy is a measure of the probability of finding a system in a given configuration - period. You might have seen it in your statistical physics or thermodynamics textbooks as:
S \propto \log(g), where g is the multiplicity of the system's configuration, given by a binomial coefficient...
A reversible process is one in which system + surroundings (the universe) gains no net entropy. One part can still gain entropy at the expense of the other, but only if the entropy gained by one exactly equals the entropy lost by the other.
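A minimal numeric sketch of that bookkeeping, using reversible heat transfer as the example (the values of Q and T are illustrative, not from the text):

```python
# Reversible heat transfer: heat Q leaves the surroundings and enters
# the system, both at the same temperature T, so each side changes
# entropy by Q/T with opposite signs.
Q = 100.0   # joules of heat transferred (illustrative value)
T = 300.0   # kelvin, common to system and surroundings (illustrative)

dS_system = Q / T            # entropy gained by the system
dS_surroundings = -Q / T     # entropy lost by the surroundings

# Net entropy change of the universe is zero for a reversible process.
print(dS_system + dS_surroundings)  # 0.0
```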
A reversible process...
That's right.
Following Kittel, the multiplicity for a system of N particles, each of which has only two available states ("up" or "down" spin), is simply
g=\frac{N!}{N_{up}!N_{down}!}
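For concreteness, this multiplicity can be computed directly (the function name `multiplicity` is mine, not from the text):

```python
from math import comb, factorial

def multiplicity(n_up, n_down):
    """g = N! / (N_up! * N_down!) for a two-state spin system."""
    n = n_up + n_down
    return factorial(n) // (factorial(n_up) * factorial(n_down))

# For N = 4 with 2 up and 2 down: 4!/(2! 2!) = 6 configurations.
print(multiplicity(2, 2))  # 6

# This is just the binomial coefficient C(N, N_up).
print(comb(4, 2))  # 6
```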
If I add one particle (say spin up), we have:
g=\frac{(N+1)!}{(N_{up}+1)!\,N_{down}!}
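A quick check of how the multiplicity grows when one spin-up particle is added. The starting numbers (N_up = N_down = 10) are illustrative; the ratio g'/g works out to (N+1)/(N_up+1):

```python
from math import factorial, log

def multiplicity(n_up, n_down):
    """g = N! / (N_up! * N_down!) for a two-state spin system."""
    n = n_up + n_down
    return factorial(n) // (factorial(n_up) * factorial(n_down))

g_before = multiplicity(10, 10)  # N = 20
g_after = multiplicity(11, 10)   # one spin-up particle added, N = 21

# Dimensionless entropy sigma = ln(g); the change on adding a particle:
delta_sigma = log(g_after) - log(g_before)

# g'/g = (N+1)/(N_up+1) = 21/11 here.
print(g_after / g_before)
```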
Since...
I've noticed that a great many authors talk about "entropy transfer" across a system boundary.
But entropy is defined as \log(multiplicity): it is a measure of the states available to a system in a given configuration. We can transfer mass, charge, or energy from one system to another, and thus...