In America, wars have long been fought in some country far away, meaning that the media is the only real way for us to have a sense of what is going on. What the media chooses to show and tell us is very important to what we think of wars. If we were shown how gruesome every war really was, if we were shown the discarded limbs and the bleeding bodies, would we be so ready to jump into another? With the internet now providing a much less censored source of information (in the U.S. at least), we're finally being shown the truth about World War II, Vietnam, and Iraq. Are these graphic images we're now being shown part of the reason why we have refused to let Obama start yet another war in Syria? Or is it just economic fatigue and wariness about how long this one will last? It's hard to say, but it's certainly interesting to think about.
Why did the gusto and patriotism associated with wars in the first half of the 20th century end? Going off to war is still often seen as noble, but the nature of what soldiers have to do leaves us with an eerie and uncomfortable feeling. Why aren't we allowed to see the truth of what goes on in war? Do we even really want to know? Personally, if I'm honest, I don't really want to know. However, I believe that it is my responsibility to know. If I'm going to contribute in any way to the decision of whether or not we go to war, I sure want to know what I'm talking about. Democracy only works if the public is informed; it's one of the first things you learn in Government classes. Yet it can be hard to wade through media propaganda, whether it comes from the war hawks or the peacekeepers.
Biases aside, the media is essentially the only real way to know what's going on in a war, and war itself is fueled by public opinion. This makes the media one of the most powerful forces in society, and one of the most powerful tools for control by a government.