I think women taking care of their health is a better guarantee of long-term beauty than makeup. The chemicals in a lot of makeup may actually end up damaging and aging skin as well as impacting women's health in general.
As for the latest trends, I am sure they are heavily astroturfed by corporations with agendas to make women look like androgynous Men.
It is Satanic, to be honest. Just my opinion.
EXACTLY. I've always viewed makeup the same way. The problem is that women are usually much harder on themselves than Men are on them. You'd think women would want to avoid makeup because it ages their skin and makes their appearance look worn much earlier in life. There's also nothing romantic about kissing a girl who's wearing lipstick, or about the marks it leaves on glasses.
Interestingly enough, I got into some heated arguments with an ex of mine over this. She thought I had jealousy issues and wanted to make her look "less beautiful" in public, especially around other Men, of all things! An idea that never could have even crossed my mind!
I think women overthink the idea that a Man will criticize even a small amount of acne or blemishes, or the natural texture of their skin, when in reality we don't.
When I see girls who wear makeup, it always makes me think they have something to hide, and if they prefer to conceal themselves that way, I wonder what else they hide behind.
Healthy, Sane, Heterosexual Men would also naturally prefer bare faces to makeup, because the glow and pheromones that come from the face, and that arouse the senses and the real attraction a Man feels when looking at a woman's face, are inhibited by makeup.