I'm sorry it sounds like you've had bad experiences on this side of the world! I've been hearing horror stories about the UK in that regard for a while now. And, of course, I've heard a lot about the goings-on in my home country.
Well, let me first say, it's not as if I had any real conversations with anyone in Japan about the role of the U.S. in Japan's history. It was just that, after a bit of studying, I came to learn how intertwined our histories are.
I also felt that, in the post-WWII era, Japan and the U.S. were allies with a closeness comparable to that between the U.S. and Europe. The cultural exchange between the two countries has also been huge. There are Japanese products that have been staples for me my whole life, and I know that's the case for many Japanese people when it comes to American products.
In the U.S., at least, this has led to a massively positive regard for Japan and its people. There's a significant segment of our population that puts the Japanese on a pedestal. I assumed that, with cultural influence going in both directions, the feeling would be mutual.
When I did ask Japanese friends how they viewed America, they told me they felt positively about the country. Though that may have been because they knew they were talking to an American.