For context, I’m circumcised and expecting a son and my wife and I are torn about the circ. We’re American so from a cultural standpoint circumcision is the default choice. Thing is, there’s no real benefit besides practicing a religion we don’t believe in, and I’m uncomfortable about cutting the tip of my son’s dick off.

On the other side, I've met a guy who was bullied so badly for it in high school that he got a circ as an adult. Apparently the recovery was crazy painful. I've also talked to women who are generally grossed out by uncircumcised men. I don't want to make him feel like something's wrong with him his whole life because I was uncomfortable with the idea.

From a moral standpoint I'm against it, but from a social and cultural standpoint I feel like I should do it? It's a crappy situation. If there are any uncircumcised American men who want to talk about their penis, I'm all ears.