American nationalism is a somewhat ambiguous term. In theory, the word "American" could refer to the Americas as a whole, but in practice that sense is almost never the intended one; instead, the word refers to the United States of America.
Historically, when Whites were the overwhelming majority of the United States population, "American nationalism" was close in meaning to White nationalism as applied to United States Whites. This was a widespread, mainstream view.
Thus, originally, voting rights in the various states were typically restricted to non-poor White men, and the Naturalization Act of 1790 limited naturalized United States citizenship to free White persons of good character.
"American nationalism" may also be used to refer to the more recent, no longer mainstream, White nationalist movement.