Doesn’t Christianity promote toxic masculinity and denigrate women?