r/AskWomen
What’s one truth about life that women often notice, but people don’t want to admit?