How is it that we’ve all been taught about sex since school, but somehow, no one really talks about the whole “we’re in this together” thing—do we just wing it and hope for the best?