I've often heard it said that Americans are uncomfortable with sex, and that this is reflected in the fact that depictions of sexuality or nudity are often forbidden in popular media, while depictions of graphic violence are ubiquitous. Implicit in this observation is the idea that depictions of violence should rightly seem as bad as, or worse than, depictions of sex. But what makes any such depiction bad? Is it just a matter of the psychological distress it causes? Is it that it encourages people to do what it depicts? Are some things just intrinsically obscene?