Hollywood is the world’s leading film industry. Americans are proud of Hollywood because of its immense contribution to America’s global brand and soft power. But an often overlooked aspect is how film has shaped the image of democracy in the United States. Scott Simon of NPR discusses with film experts Wesley Morris and Mark Harris the films that have influenced how Americans view democracy.
Visit this link for the audio and transcript of this insightful discussion.