I've been wanting to post this thread for several months, but I wanted to do as much online research as possible before bringing it up. Even so, I've largely come up with nothing definite. A common refrain over the last few years has been that "Americans don't like the police anymore" and "The police aren't popular in the U.S. anymore." When (as precisely as possible) and for what reasons did this change in public attitudes occur?

I thought about the Trayvon Martin shooting and the subsequent investigation and acquittal of George Zimmerman. But that case didn't involve law enforcement in the actual shooting at all.

I thought about the Ferguson, Missouri shooting (the "hands up, don't shoot" incident), in which an officer killed a young black man. But IIRC an exhaustive investigation cleared the officer and found the use of deadly force to be justified.

Some have suggested that the overall decline in crime in the U.S. has made the police, and the tactics they use, seem far, far less necessary and justified than they did in the 1970s and '80s, when street crime exploded. In other words, law enforcement is a victim of its own success.

Thoughts?