We studied aggression in Psychology and briefly touched on aggression in the media and its effects on people. What struck me as interesting was that American studies tended to point to violent movies causing people to act aggressively, whilst UK research tended to show the opposite (violent movies having little or no effect). I don't know if our textbook, and the documentary we watched to accompany it, was being deliberately biased in the research it selected, but I did think it was odd that the research was so polarised.
And on top of that, some of them were pretty poor studies on both sides. You see what I mean: those studies seem worthless to me, and I put no stock in them.