Is Western media biased against the West?
Whenever I open Western media I see negative news about Western countries like the US, Britain, etc. For example, I see videos insulting or making fun of the sitting president of the USA, and coverage of gun violence, racism, and so on. In the UK, there are videos of stabbing incidents and some religious violence (something about Hindu vs. Muslim or something similar). While all of these incidents may have actually happened, it is a bit fascinating that they are highlighted to the extent that they are.
I think that if an ordinary person read Western media reporting on Western countries, they would conclude those countries are somehow in decline, and yet on most measurable indices of human development and growth they seem to be at the top of the list.
People usually say Western media is negatively biased against countries outside the West. I intuitively feel there is some truth to this, but is there quantitative evidence that Western media is biased against the West itself?
By Western media, I mean the top news channels based in the West, which together account for more than 90% of total domestic and international news viewership.
Edit: I am sorry, but how is this a bad-faith post? What hidden agenda could I have?