When did the term feminism first appear in France?
The label first appeared in France in 1871, in a medical thesis describing male tuberculosis patients who had developed feminine traits. This usage was pejorative, meant to criticize what the author saw as a confusion of the sexes. In 1872, Alexandre Dumas fils applied the word to men who supported women's rights, reinforcing its derogatory connotation.