Hey guys, I was thinking about Anchorman 2, and I would love for my journalism students to watch a documentary at the beginning of the year about how the media's presentation of the news has shifted from reporting to entertainment. I vaguely recall someone once telling me there was a very good one on this exact topic, but I don't remember the name. Any ideas?
(As much as I love Anchorman 2, I would like to keep my job, which doesn't include showing it.)