"A journalists job is to report the news, if they don't feel confident in a given domain then they can sit the story out."
That may have been the case in the past, but "journalism" has changed radically within the past decade. Today it's more about telling the story that media conglomerates want the public to hear than about honest reporting.
Public distrust of the media is at an all-time high, and growing.
Is it really the case that a decade ago journalism was not telling the story that media conglomerates wanted the public to hear? Or did that just seem to diverge from what you personally wanted to hear about a decade ago?
I mean, I feel this. I watched George Bush get mocked on the news for his stumbling speech almost every day. I also watched them do daily reports on the deaths in Afghanistan and Iraq. When Obama took over they stopped reporting deaths, and some organizations cheered him on while others broadcast viscerally racist shit.
So, to answer your question: it's both. It's bullshit I don't want to hear and what a grip of powerful people probably want me to hear.