Western media are independent in the sense that they tend to be corporate-owned rather than state-owned, but when was the last time a major American TV channel disagreed with the government on a prominent foreign policy issue?
Mainstream US media tend to reflexively back any American war of aggression. They peddled US government lies about Iraqi WMD, and as I recall they never acknowledged their responsibility for deceiving the public or apologised.