I have come to accept that I have a foot fetish, of the female persuasion.
Is that strange? Or sick? I mean, has society progressed to a point of general acceptance toward people
and their unique fetishisms?
Whether it's TV commercials for women's sandals, the internet porn I watch, etc., I always find myself drawn to the feet.
Should I embrace my fetish or be shunned?
I wonder if it's some repressed male chauvinistic urge I have to "keep women barefoot and pregnant."