Look it up.

"The term "American Empire" refers to the United States' cultural ideologies and foreign policy strategies. The term is most commonly used to describe the U.S.'s status since the 20th century, but it can also be applied to the United States' world standing before the rise of nationalism in the 20th century."

Americans dislike being tagged as an imperialist power. I wouldn't like it either.