
Why didn't the USA move to the left after WWII?




After the Second World War, and after the full impact of what the Nazis had done became clear, it seems like international politics moved to the left: in the East and Far East, support for communist movements rapidly rose; in the West, Socialist, Labour, and Trade-Unionist parties saw a sudden surge in support. The largest swing in UK electoral history, for example, was to the Labour party in the 1945 general election.

The US doesn't seem to have experienced this. Instead, if anything, it lurched to the right.

The Nazis are and were demonised as much in the US as in any of the other Second World War allies. Why didn't the US experience the same political shift away from that direction that most other involved countries did?


It's probably too simplistic to say "the rest of the world jumped to the Left". That's not really the case. It's also pretty simplistic to say the U.S. "lurched to the right" - what does this mean in practical terms? The right wing/left wing paradigm is in and of itself a simplistic paradigm that often obfuscates more than it illuminates. It also ignores the domestic political contexts in these nations that supposedly jumped to the left. I'm sure people had their own reasons for their voting intentions.

What we can say is that there are a couple of reasons why the Americans have traditionally favoured liberal capitalistic politics.

  1. Cultural: The Americans were already pretty capitalistic; this was embodied in their early history as a nation and the laissez-faire period of the late 1800s & their frontier culture.
  2. Strategic competition with Russia: The threat of a Soviet invasion of Western Europe would have represented a strategic imperative to counter the political ideology of socialism. This would also have been exacerbated by Mao's success in the Chinese civil war.
  3. Labour market forces: The United States has always been a capitalistic country and the more powerful commercial entities would have considered it in their interests to use the threat of communism to degrade the threat of militant unionism.
  4. Human rights abuses: The abuses committed under Stalin, and earlier during the Russian civil war, would probably have been somewhat known to at least some in the American electorate.
  5. Consumerism: a wild success in the 1950s, it presented an excellent ideological alternative to the austerity of socialism. Judging simply by the results of the two systems, American liberal capitalism was superior to the Russian socialist system.
  6. Economics: The massive mobilisation of American industry for WW2 turned the U.S. into an industrial behemoth. The Great Depression was a fading memory by this time, and the ideals of the American dream - individual prosperity - created a new norm and new aspirations for the individual American.

In short there was not a single root cause for this but a number of existing and emerging factors that would have contributed to voter intentions. But did the United States really lurch to the right? What is "right" in this context? By what yardstick do we measure "rightness"? This is the problem with the right-wing left-wing paradigm - it doesn't explain a lot and tends to obfuscate more complicated political realities.


One reason was that America was the biggest winner of World War II. It started the war with about 40% of the world's industrial capacity (according to Paul Kennedy, "The Rise and Fall of the Great Powers"), and ended with about half the capacity of a war-torn world. Other countries that were on the winning side - Britain, the Soviet Union, and China - were worse off than before the war. The war weariness was most felt in Britain, where voters threw out war leader Churchill. Many people in Britain went "left" because they did not feel that the sacrifices were worth it, or that they had gotten their fair share of the spoils.

Returning American "GI" veterans were the most empowered men in the world. The GI Bill gave men from working-class families the means to attend college, and later staff the managerial ranks of America's burgeoning industrial complex. Those that retained "blue" collars were still war heroes, and treated as such by unions and management. American workers had fewer grievances than those elsewhere, and were less likely to sympathize with "leftists."

In 1945, there were three basic political groups: liberal Democrats, conservative Democrats, and (right-leaning) Republicans. Taken as a "body," the American public was quite conservative. President Franklin Delano Roosevelt was admired for his "competence," but he was actually placed in power by a conservative Texan, John Nance Garner. After FDR replaced Garner as Vice-President with his leftist "soul mate," Henry Wallace, in 1940, the Democratic Party balked in 1944 and insisted on the more conservative Harry Truman, who (in)famously said, "If we see that Germany is winning we ought to help Russia and if Russia is winning we ought to help Germany, and that way let them kill as many as possible, although I don't want to see Hitler victorious under any circumstances." After he became President, Truman allied with (West) Germany and started the Cold War against the Russians. This helped lead to the vilification of "left wingers" in the United States.

"The Nazis are and were demonised as much in the US" as elsewhere. That's true only up to a point. It is important to note that through the 1950s, some people in America advocated racial policies against African-Americans that were actually "milder" versions of Nazi racial policies. Some authorities believe that Naziism was partly inspired by the American eugenics movement of the 1920s.


In occupied western Europe, underground communist parties took a large role in anti-Nazi resistance, for which they were electorally rewarded. Not being occupied, the war experience in the UK and the USA was very different. Experiences in eastern Europe, where Nazi terror was replaced by Stalinist oppression, were different again.

Three days from now (on 25 February), the February strike will be exactly 75 years ago. This was one of the major events in Nazi-occupied Netherlands. The strike was initiated by the underground Communist party, aiming to stop the persecution of the Jews. Although it didn't last long and didn't save any Jews, it earned the Communist party major credit and their best-ever election result in the first post-war elections, when they received over 10% nationally (1937: 3.4%) and over 30% in Amsterdam, the centre of the strike and of other communist-led resistance. Their newspaper (De Waarheid, or Truth) was very briefly the largest newspaper in The Netherlands. The support dwindled quickly once the communist party harshly treated (including through character assassination) anyone not following the line determined centrally (from Moscow), including communists whom many considered war resistance heroes. Not much support was left after the Communist Party backed the Soviet invasion of Hungary following the Hungarian revolution, when communist support in national elections fell to 2.4%. It didn't take any McCarthyism to severely damage the Party - the Dutch communists did it all to themselves/with "help" from Moscow¹.

Communists were very active in other elements of the resistance as well, in The Netherlands, France, and elsewhere.


¹The Dutch secret service did its part by encouraging splits within the Communist party. To what extent they contributed to the demise of the Communist party remains up for debate.


Compared to when?

If you look at the USA politically in 1946 vs 1939 you see a very different shift than if you were to compare 1946 to 1926.

The Great Depression and the resulting New Deal did push the USA further "left" (if you can call it that).

The thing to notice is that in this case there was a clear, compelling reason to do so: 25% of Americans were out of work at various points in the 1930s. Conditions on the American mainland itself drove this shift to the left.

Contrast this to post-WWII. America by and large saw its economic, political, and cultural systems work in WWII and lead it to victory. While the factors you cite are true, the average American in fall 1945 would have been far more likely to see and value the success of the American system - and to see strength and value in leaving it untouched.

There just was not a compelling reason for the average American to be motivated to "push America left" as you are asking about.


Your question operates from a very black and white view of left/right, and your mistaken assumption is that the Nazis were of the political right.

The Nazis were National Socialists and often leaned left. Their fight against the Soviets wasn't because they were opposed to Communism. Remember that Hitler allied with Stalin and then double-crossed him, so there wasn't a lot of ideology going on there.

"Liberal Facism" by Jonah Goldberg outlines in great detail how the Nazis had more in common with traditional leftism than American conservatism.


In order to discuss this sensibly, it's necessary to distinguish the social-democrat left from the actual Communist left. "Fabian" socialism as opposed to revolutionary Marxism. The former pursued economic redistribution, the establishment of public healthcare and education systems, and nationalisation of some industries while generally leaving private property alone and retaining elections. The latter didn't, and were often thoroughly infiltrated by actual Stalinists and KGB agents. Although not as thoroughly as Joe McCarthy would have you believe.

Another key point is that European countries became command economies (or at least heavily requisitioned economies) as soon as the war started. Non-state economic activity became very difficult due to lack of resources and manpower, while a huge state economy was built to produce war materiel and ration everything else.

Let's also not understate how much was destroyed, and how many people were killed, exiled, expropriated, wounded or seriously inconvenienced by the war. It was a war of indiscriminate destruction from the air. A significant fraction of Europe's remaining traditional hereditary aristocracy were killed, in some cases wiping out entire family lines.

The existing social and economic order was simply blown to pieces, a lot needed to be (re)built, and everyone was already mobilised. It's a short step from state-directed building of aircraft, hospitals and barracks to state-directed building of cars, hospitals and houses.

America suffered no such destruction of property, providing less of an opportunity to redistribute its replacement. Meanwhile the foundations for post-war technological industries were being built around the arms manufacturers, and the post-war space race.

There were a lot of public left/right confrontations in various countries in the 60s. Everything from the US civil rights movement to the soixante-huitards to the Greek internal conflict which collapsed into a military dictatorship. Italy could have gone either way (and the CIA were involved there, in Operation Gladio). Also de-colonialisation by France and the UK; arguably this is a shift to the "left".

It's also important to look at how pivotal individual figures were and how differently things could have gone if, say, JFK and MLK hadn't been assassinated. The US could have ended up not so far to the right.

But ultimately a lot of the US "rightism" was straightforward power politics of anti-communism, competing against the Soviet Union. This included sponsoring terrorism and coups in South America, the Vietnam War, and so on.

Edit: this is a huge question, really. How much of "Europe" are you counting? France+Benelux+West Germany+Scandinavia+Italy count as "left", I suppose, but what about military dictatorships in Spain+Portugal+Greece?

About bomb damage in the UK: while this only affected a small fraction of buildings, it was an omnipresent threat in any urban area. The British victory is seen in popular history as fundamentally collective - "pulling together", "blitz spirit", Dunkirk's "little ships", rationing, etc. (Slightly contradicted by talking about "the few" of the air superiority fighters and the aristocratic flavour of the RAF, though).

America lacks universal health care partly because of a persistent popular belief that ill-health reflects immorality, while the UK set its system up at a time (1944) when anyone could be injured by shrapnel at any time without it being a reflection on their character.


This is a great question, and I've decided to write my own answer since the existing ones fail to mention a few facts.

  1. Reason #1 should be J. Edgar Hoover. Even before Truman created the 'national security state', the FBI had succeeded in infiltrating the communist party and most left-wing organizations. Of every four card-carrying communist party members, three were FBI informants. The 'left' field was completely and utterly compromised.

  2. The unions, unlike their European counterparts, were averse to communist propaganda. Some union leaders had been 'fellow travelers' in the 1930s, but the purges, the Winter War, and the Molotov-Ribbentrop Pact quickly changed their minds (cf. the case of Walter Reuther, who was instrumental in getting rid of pink colors in the UAW membership). The unions were quite influential, and members of the New Deal coalition.

  3. Harry Truman gathered various conservative, wealthy, East Coast-educated, Georgetown-dining gentlemen in the national security apparatus. This machine worked efficiently to protect itself and the American state from subversion, i.a. by allying with and indirectly controlling mainstream media outlets (see the Alsop brothers, for instance). It also expended much effort and money to subvert European communist parties, to set up a 'stay-behind' shadow state, and to train law enforcement and special services there…

  4. Stalin had his hands full and naturally devoted more attention to the events in Europe.

  5. The United States quickly climbed out of a shallow post-war economic depression and surged forward. Whatever rationing there was was quickly abolished. A full belly would not harbor treasonous thoughts.


The main difference between the totalitarian states of the 1900s and the democracies was not in economic policy at all, but in the extent and detail to which the (federal) state was allowed to rule over citizens' (private) lives by force.

The bipolar left-right economic spectrum did not take the authoritarian/personal-liberty dimension into account at all.

"Extreme left" and "extreme right" did not mean extremely much to the left or extremely much to the right. It meant "economically left But with extremely little personal freedom" and "economically right But with extremely little personal freedom".


The share of government spending in GDP is mainly correlated with growth of the service sector and urbanization. In agrarian societies, people are independent and can live off the land. As countries develop people become more dependent on government.

In the US urbanization was a bit slower, hence, the slower spending growth.


I wasn't able to read all of the comments above - but I am sure someone has mentioned the dichotomy between statism and anarchy/libertarianism. Governments everywhere desired more control post-WWII (including the United States), but people were often more willing to grant it in affluent Europe or in nations undergoing an industrial revolution, such as those in Central or South America.

Many in the U.S., at least, identify with the idea of limited government more than their European counterparts - certainly in regard to economics. We've seen this play out significantly over the past 50 years.


There was a lot of left in America during those years. I'm skipping some countries, but I'm sure you can read something about the left in America over here and here.

Edit: Question title changed, answer no longer valid. Thanks for correcting!


A 'Forgotten History' Of How The U.S. Government Segregated America

Federal housing policies created after the Depression ensured that African-Americans and other people of color were left out of the new suburban communities — and pushed instead into urban housing projects, such as Detroit's Brewster-Douglass towers. (Photo: Paul Sancya/AP)

In 1933, faced with a housing shortage, the federal government began a program explicitly designed to increase — and segregate — America's housing stock. Author Richard Rothstein says the housing programs begun under the New Deal were tantamount to a "state-sponsored system of segregation."

Historian Says Don't 'Sanitize' How Our Government Created Ghettos

The government's efforts were "primarily designed to provide housing to white, middle-class, lower-middle-class families," he says. African-Americans and other people of color were left out of the new suburban communities — and pushed instead into urban housing projects.

Rothstein's new book, The Color of Law, examines the local, state and federal housing policies that mandated segregation. He notes that the Federal Housing Administration, which was established in 1934, furthered the segregation efforts by refusing to insure mortgages in and near African-American neighborhoods — a policy known as "redlining." At the same time, the FHA was subsidizing builders who were mass-producing entire subdivisions for whites — with the requirement that none of the homes be sold to African-Americans.


Rothstein says these decades-old housing policies have had a lasting effect on American society. "The segregation of our metropolitan areas today leads to stagnant inequality, because families are much less able to be upwardly mobile when they're living in segregated neighborhoods where opportunity is absent," he says. "If we want greater equality in this society, if we want a lowering of the hostility between police and young African-American men, we need to take steps to desegregate."

Interview Highlights

On how the Federal Housing Administration justified discrimination


The Federal Housing Administration's justification was that if African-Americans bought homes in these suburbs, or even if they bought homes near these suburbs, the property values of the homes they were insuring, the white homes they were insuring, would decline. And therefore their loans would be at risk.

There was no basis for this claim on the part of the Federal Housing Administration. In fact, when African-Americans tried to buy homes in all-white neighborhoods or in mostly white neighborhoods, property values rose, because African-Americans were willing to pay more for properties than whites were, simply because their housing supply was so restricted and they had so many fewer choices. So the rationale that the Federal Housing Administration used was never based on any kind of study. It was never based on any reality.

On how federal agencies used redlining to segregate African-Americans

The term "redlining" . comes from the development by the New Deal, by the federal government of maps of every metropolitan area in the country. And those maps were color-coded by first the Home Owners Loan Corp. and then the Federal Housing Administration and then adopted by the Veterans Administration, and these color codes were designed to indicate where it was safe to insure mortgages. And anywhere where African-Americans lived, anywhere where African-Americans lived nearby were colored red to indicate to appraisers that these neighborhoods were too risky to insure mortgages.

On the FHA manual that explicitly laid out segregationist policies


It was in something called the Underwriting Manual of the Federal Housing Administration, which said that "incompatible racial groups should not be permitted to live in the same communities." Meaning that loans to African-Americans could not be insured.

In one development in Detroit, the FHA would not go ahead during World War II unless the developer built a 6-foot-high cement wall separating his development from a nearby African-American neighborhood, to make sure that no African-Americans could even walk into that neighborhood.

The Underwriting Manual of the Federal Housing Administration recommended highways as a good way to separate African-American from white neighborhoods. So this was not a matter of law, it was a matter of government regulation, but it also wasn't hidden, so it can't be claimed that this was some kind of "de facto" situation. Regulations that are written and published in the Underwriting Manual are as much a de jure unconstitutional expression of government policy as something written in law.

On the long-term effects of African-Americans being prohibited from buying homes in suburbs and building equity

Today African-American incomes on average are about 60 percent of average white incomes. But African-American wealth is about 5 percent of white wealth. Most middle-class families in this country gain their wealth from the equity they have in their homes. So this enormous difference between a 60 percent income ratio and a 5 percent wealth ratio is almost entirely attributable to federal housing policy implemented through the 20th century.

African-American families that were prohibited from buying homes in the suburbs in the 1940s and '50s, and even into the '60s, by the Federal Housing Administration gained none of the equity appreciation that whites gained. So the Daly City development south of San Francisco, or Levittown, or any of the others in between across the country: those homes in the late 1940s and 1950s sold for about twice national median income. They were affordable to working-class families with an FHA or VA mortgage. African-Americans were just as able as whites to afford those homes but were prohibited from buying them. Today those homes sell for $300,000 [or] $400,000 at the minimum, six, eight times national median income.

So in 1968 we passed the Fair Housing Act that said, in effect, "OK, African-Americans, you're now free to buy homes in Daly City or Levittown," but it's an empty promise, because those homes are no longer affordable to the families that could've afforded them when whites were buying into those suburbs and gaining the equity and the wealth that followed from that.


The white families sent their children to college with their home equity; they were able to take care of their parents in old age and not depend on their children; they were able to bequeath wealth to their children. None of those advantages accrued to African-Americans, who for the most part were prohibited from buying homes in those suburbs.

On how housing projects went from being for white middle- and lower-middle-class families to being predominantly black and poor

Public housing began in this country for civilians during the New Deal, and it was an attempt to address a housing shortage; it wasn't a welfare program for poor people. During the Depression, no housing construction was going on. Middle-class and working-class families were losing their homes when they became unemployed, so there were many unemployed middle-class and working-class white families, and this was the constituency that the federal government was most interested in. And so the federal government began a program of building public housing for whites only in cities across the country. The liberal instinct of some Roosevelt administration officials led them to build some projects for African-Americans as well, but they were always separate projects; they were not integrated.

The white projects had large numbers of vacancies; black projects had long waiting lists. Eventually it became so conspicuous that the public housing authorities and the federal government opened up the white-designated projects to African-Americans, and they filled with African-Americans. At the same time, industry was leaving the cities and African-Americans were becoming poorer in those areas, so the projects became projects for poor people, not for working-class people. They became subsidized, when they hadn't been subsidized before. And so they became the vertical slums that we came to associate with public housing.

The vacancies in the white projects were created primarily by the Federal Housing Administration program to suburbanize America, and the Federal Housing Administration subsidized mass production builders to create subdivisions that were "white-only" and they subsidized the families who were living in the white housing projects as well as whites who were living elsewhere in the central city to move out of the central cities and into these white-only suburbs. So it was the Federal Housing Administration that depopulated public housing of white families, while the public housing authorities were charged with the responsibility of housing African-Americans who were increasingly too poor to pay the full cost of their rent.

Radio producers Sam Briger and Thea Chaloner and Web producers Bridget Bentz and Molly Seavy-Nesper contributed to this story.


Why did the United States and the Soviet Union mistrust each other after World War II?

Return to status quo. Sovietization of Eastern Europe, USSR-China alliance.

Explanation:

The Soviet Union's proclaimed goal was worldwide communism. Because of this, there had been no trust from the start between the two countries. WWII was a period of atypical cooperation between them. Once the common goal of crushing Nazi Germany was achieved, the relationship returned to its normal state. Even during WWII, the level of trust was limited.

Right after the WWII, the USSR embarked upon the sovietization of the European regions under its occupation. Despite promising to hold fair elections in these countries, the USSR set up puppet regimes. The US feared further encroachment of the USSR and expansion of the "red zone". The alliance of the USSR and the Communist China made the "red zone" downright horrifying in size: it stretched from Berlin to Shanghai.

On their part, the Soviet rulers were constantly afraid of the possible encroachment of "western influence" among the population (the influence of consumerism, democratic values, free press, liberalism, western art, music, cinema, just about anything). Their very rule depended on constant anti-Western scaremongering.

There has been a return to this practice in recent years in Russia, but we are still permeated by Western media etc. In the USSR, allowing such a level of Western penetration would have been suicidal for the regime, because it would have shattered too many sensitive lies. Because of that, "trusting" the West was out of the question.

If in the US it was permissible to express positive feelings about the USSR and the Communist Party, in the USSR the Western states were seen as 'ideological enemies', and it was not until the arrival of Nikita Khrushchev that the doctrine of "Peaceful Coexistence" was proclaimed. Even after that, it was not until about 1989 that a Soviet politician could pursue a pro-Western policy or express pro-Western views and remain in his seat (or stay alive, before 1953).


The Limits of New Deal Reform

Despite the growing support from black voters, President Franklin D. Roosevelt remained aloof and ambivalent about black civil rights. His economic policies depended on the support of southern congressional leaders, and FDR refused to risk that support by challenging segregation in the South. During Roosevelt’s first term, the administration focused squarely on mitigating the economic travails of the Depression. This required a close working relationship with Congresses dominated by racially conservative southern Democrats, including several Speakers and most of the chairmen of key committees. “Economic reconstruction took precedence over all other concerns,” observed historian Harvard Sitkoff. “Congress held the power of the purse, and the South held power in Congress.” 43

Image courtesy of the Library of Congress: Members of the NAACP New York City Youth Council picket in 1937 on behalf of anti-lynching legislation in front of the Strand Theater in New York City's Times Square. That same year an anti-lynching bill passed the U.S. House, but died in the Senate.

The failure to pass anti-lynching legislation underscored the limitations of reform under FDR. In this instance—unlike in the early 1920s when there were no black Representatives in Congress—an African-American Member of Congress, Arthur Mitchell, refused to endorse legislation supported by the NAACP. Moreover, Mitchell introduced his own anti-lynching bill in the 74th Congress (1935–1937), which critics assailed as weak for providing far more lenient sentences and containing many legal ambiguities. Given the choice, Southerners favored Mitchell’s bill, although they amended it considerably in the Judiciary Committee, further weakening its provisions. Meanwhile, Mitchell waged a public relations blitz on behalf of his bill, including a national radio broadcast. Only when reformers convincingly tabled Mitchell’s proposal early in the 75th Congress (1937–1939) did he enlist in the campaign to support the NAACP measure—smarting from the realization that Judiciary Committee Chairman Hatton Sumners of Texas had misled and used him. The NAACP measure passed the House in April 1937 by a vote of 277 to 120 but was never enacted into law. Instead, Southerners in the Senate effectively buried it in early 1938 by blocking efforts to bring it to an up-or-down vote on the floor. 48 The rivalry between Mitchell and the NAACP, meanwhile, forecast future problems. Importantly, it revealed that African-American Members and outside advocacy groups sometimes worked at cross-purposes, confounding civil rights supporters in Congress and providing opponents a wedge for blocking legislation.


Why was the United States so prosperous from after WW2 until the 1970s?

Every other western nation was destroyed so we were the world's manufacturers.

Also much of their most productive workforce was killed or maimed. The UK was the only major power that wasn’t destroyed and they were in a desperate financial state and continued rationing into the 50s.

This isn't unique to the US. Almost all western countries have a similar graph: strong growth post-war until the 70s, then slower growth or plateauing.

Because we had everyone's gold. Also we moved off the gold standard in the 70s.

We were able to turn our wartime productivity levels into peacetime productivity levels. However, unlike Western Europe, we didn't have the problem of rebuilding massive amounts of buildings and other infrastructure that was destroyed in the war.

We had massive comparative advantage.

In the 1970s, the Nixon and Carter administrations both began to handcuff the economy more, hurting our productivity just as Asia and Europe were closing the competitive gap. This was fueled in part by the removal of the gold standard, which allowed currencies to float freely in the markets. That was a massive benefit for US consumers, but less so for US workers.

Something you tend to hear is that presidents back then taxed and spent a lot, which led to prosperity. How can I refute this?

A nice myth, now reinforced by the Keynesian school (which promotes government spending) getting the lion's share of attention in education. And public education loves Keynesian economics, because the system benefits from it directly: government spending funds education.

FDR's spending to exit the Depression of the 1930s was a massive failure. The government was still strangling the economy, preventing the reset of activity which was necessary for healthy economic activity. Hoover's interference was shoved under the rug - government pressure for higher wages was a problem during the Hoover Administration first.

Here's a thought on the system that originated then, and we still have today.

We have tried spending money. We are spending more money than we have ever spent before and it does not work. And I have just one interest, and if I am wrong . . . somebody else can have my job. I want to see this country prosperous. I want to see people get a job, I want to see people get enough to eat. We have never made good on our promises. . . . I say after eight years of this administration we have just as much unemployment as when we started . . . . And an enormous debt to boot!

Henry Morgenthau, Jr., Secretary of the Treasury, 1934–1945



Digital History TOPIC ID 127

The film industry changed radically after World War II, and this change altered the style and content of the films made in Hollywood. After experiencing boom years from 1939 to 1946, the film industry began a long period of decline. Within just seven years, attendance and box receipts fell to half their 1946 levels.

Part of the reason was external to the industry. Many veterans returning from World War II got married, started families, attended college on the GI Bill, and bought homes in the suburbs. All these activities took a toll on box office receipts. Families with babies tended to listen to the radio rather than go to the movies; college students placed studying before seeing the latest film; and newlyweds purchasing homes, automobiles, appliances, and other commodities had less money to spend on movies.

Then, too, especially after 1950, television challenged and surpassed the movies as America's most popular entertainment form. In 1940, there were just 3,785 TV sets in the United States. Two decades later, nine homes in every ten had at least one TV set. For preceding generations of Americans, clothing styles, speech patterns, and even moral attitudes and political points of view had been shaped by the movies. For post-World War II Americans, television largely took the movies' place as a dominant cultural influence. The new medium reached audiences far larger than those attracted by motion pictures, and it projected images right into families' living rooms.

Internal troubles also contributed to Hollywood's decline. Hollywood's founding generation--Harry Cohn, Samuel Goldwyn, Louis B. Mayer, Darryl Zanuck--retired or were forced out as new corporate owners, lacking movie experience, took over. The film companies had high profiles, glamour, undervalued stock, strategically located real estate, and film libraries which television networks desperately needed. In short, they were perfect targets for corporate takeovers. The studios reduced production, sold off back lots, and made an increasing number of pictures in Europe, where costs were lower.

Meanwhile, Hollywood's foreign market began to vanish. Hollywood had depended on overseas markets for as much as 40 percent of its revenue. But in an effort to nurture their own film industries and prevent an excessive outflow of dollars, Britain, France, and Italy imposed stiff import tariffs and restrictive quotas on imported American movies. With the decline in foreign markets, movie making became a much riskier business.

Then an antitrust ruling separated the studios from their theater chains. In 1948, the United States Supreme Court handed down its decision in the Paramount case, which had been working its way through the courts for almost a decade. The court's decree called for the major studios to divest themselves of their theater chains. In addition to separating theater and producer-distributor companies, the court also outlawed block booking, the fixing of admissions prices, unfair runs and clearances, and discriminatory pricing and purchasing arrangements. With this decision, the industry the moguls built--the vertically integrated studio--died. If the loss of foreign revenues shook the financial foundation of the industry, the end of block booking (a practice whereby the exhibitor is forced to take all of a company's pictures to get any of that company's pictures) shattered the weakened buttress. Film making had become a real crap shoot.

One result of the Paramount decision and the end of the monopoly of film making by the majors was an increase in independent productions. Yet despite a host of innovations and gimmicks--including 3-D, Cinerama, stereophonic sound, and CinemaScope--attendance continued to fall.

Hollywood also suffered from Congressional probes of communist influence in the film industry. In the late 1930s, the House of Representatives established the Un-American Activities Committee (HUAC) to combat subversive right-wing and left-wing movements. Its history was less than distinguished. From the first it tended to see subversive Communists everywhere at work in American society. HUAC even announced that the Boy Scouts were Communist infiltrated. During the late 1940s and early 1950s HUAC picked up the tempo of its investigation, which it conducted in well-publicized sessions. Twice during this period HUAC traveled to Hollywood to investigate Communist infiltration in the film industry.

HUAC first went to Hollywood in 1947. Although it didn't find the party line preached in the movies, it did call a group of radical screenwriters and producers into its sessions to testify. Asked if they were Communists, the "Hollywood Ten" refused to answer questions about their political beliefs. As Ring Lardner, Jr., one of the ten, said, "I could answer, but if I did, I would hate myself in the morning." They believed that the First Amendment protected them. In the politically charged late 1940s, however, their rights were not protected. Those who refused to divulge their political affiliations were tried for contempt of Congress, sent to prison for a year, and blacklisted.

HUAC went back to Hollywood in 1951. This time it called hundreds of witnesses from both the political right and the political left. Conservatives told HUAC that Hollywood was littered with "Commies." Walt Disney even recounted attempts to have Mickey Mouse follow the party line. Of the radicals, some talked but most didn't. To cooperate with HUAC entailed "naming names"--that is, informing on one's friends and political acquaintances. Again, those who refused to name names found themselves unemployed and unemployable. All told, about 250 directors, writers, and actors were blacklisted.

In 1948, writer Lillian Hellman denounced the industry's moral cowardice in scathing terms: "Naturally, men scared to make pictures about the American Negro, men who only in the last year allowed the word Jew to be spoken in a picture, who took more than ten years to make an anti-fascist picture, these are frightened men and you pick frightened men to frighten first. Judas goats, they'll lead the others to slaughter for you."

The HUAC hearings and blacklistings discouraged Hollywood from producing politically controversial films. Fear that a motion picture dealing with the life of Hiawatha might be regarded as communist propaganda led Monogram Studio to shelve the project. As The New York Times explained: "It was Hiawatha's efforts as a peacemaker among warring Indian tribes that gave Monogram particular concern. These it was decided might cause the picture to be regarded as a message for peace and therefore helpful to present communist designs." The hearings encouraged Hollywood to produce musicals, biblical epics, and other politically neutral films.

The HUAC hearings also convinced Hollywood producers to make 50 strongly anticommunist films between 1947 and 1954. Most were second-rate movies starring third-rate actors. The films assured Americans that Communists were thoroughly bad people--they didn't have children, they exhaled cigarette smoke too slowly, they murdered their "friends," and they went berserk when arrested. As one film historian has commented, the communists in these films even looked alike: most were "apt to be exceptionally haggard or disgracefully pudgy," and there was certainly "something terribly wrong with a woman if her slip straps showed through her blouse." If these films were bad civics lessons, they did have an impact. They seemed to confirm HUAC's position that Communists were everywhere, that subversives lurked in every shadow.

It is ironic that at the same time that HUAC was conducting its investigations of communist subversion, moral censorship of the movies began to decline. In 1949, Vittorio De Sica's The Bicycle Thief became the first film to be successfully exhibited without a seal of approval. Despite its glimpses of a brothel and a boy urinating, this Italian film's neo-realist portrait of a poor man's search for his stolen bicycle received strong editorial support from newspapers and was shown in many theaters.

In 1952, the Supreme Court reversed a 1915 decision and extended First Amendment protections of free speech to the movies. The landmark case overturned an effort by censors in New York State to ban Roberto Rossellini's film The Miracle on grounds of sacrilege. In addition, the court decreed that filmmakers could challenge censors' findings in court. The next year, Otto Preminger's sex comedy The Moon Is Blue became the first major American film to be released without the code's seal. Even though the film was condemned by the Legion of Decency for its use of the words "virgin" and "pregnant," efforts to boycott the film fizzled and the film proved to be a box office success. In 1966, the film industry abandoned the Production Code, replacing it with a film rating system which is still in force.


The same person

A conspiracy theory regarding the band circulated over the years among fans and the press. Fans noticed that Benny and Bjorn had never been photographed alone together, leading to rumors that they were, in fact, the same person. However, forensic scientists who examined the case argued that if the pictures are studied carefully, Benny has his hair parted to the left while Bjorn has his parted to the right.


The World War II-Era Actress Who Invented Wi-Fi: Hedy Lamarr

As we face the uncertainty of the current COVID-19 pandemic, one helpful invention has eased the anxieties of staying at home and assists us daily with our new teleworking lives. Wi-Fi (a brand name popularly, if inaccurately, glossed as "wireless fidelity") allows us to stay plugged into the internet while roaming our homes for the perfect spot to type up emails or binge-watch our favorite shows. As with the invention of the computer, the technology that made Wi-Fi possible came about during another devastating global event: World War II. The head inventor wasn't a scientist or engineer, but a famous Hollywood actress with an obsession with tinkering.

Hedy Lamarr made it big in acting before ever moving to the United States. Her role in the Czech film Ecstasy got international attention in 1933 for containing scandalous, intimate scenes that were unheard of in the movie industry up until then.

Backlash from her early acting career was the least of her worries, however, as tensions began to rise in Europe. Lamarr, born Hedwig Eva Maria Kiesler, grew up in a Catholic household in Austria, but both of her parents had a Jewish heritage. In addition, she was married to Friedrich Mandl, a rich ammunition manufacturer with connections to both Fascist Italy and Nazi Germany.

Her time with Friedrich Mandl was bittersweet. While the romance quickly died and Mandl became very possessive of his young wife, Lamarr was often taken to meetings on scientific innovations in the military world. These meetings are said to have been the spark that led to her becoming an inventor. As tensions in both her household and in the world around her became overwhelming, she fled Europe and found her way to the United States through a job offer from Hollywood’s MGM Studios.

Lamarr became one of the most sought-after leading women in Hollywood and starred in popular movies like the 1939 film Algiers, but once the United States began helping the Allies and preparing to possibly enter the war, Lamarr almost left Hollywood forever. Her eyes were no longer fixed on the bright lights of the film set but on the flashes of bombs and gunfire. Lamarr wanted to join the Inventors’ Council in Washington, DC, where she thought she would be of better service to the war effort.

Lamarr’s path to inventing the cornerstone of Wi-Fi began when she heard about the Navy’s difficulties with radio-controlled torpedoes. She recruited George Antheil, a composer she met through MGM Studios, in order to create what was known as a Secret Communication System.

The idea behind the invention was a system in which transmitter and receiver hopped together across radio frequencies, making it difficult for the Axis powers to intercept or jam the radio messages. The invention would make the Navy's radio-controlled torpedoes stealthier and far less likely to be rendered useless by enemy interference.
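The hopping scheme the paragraph describes rests on one idea: if transmitter and receiver derive the same channel sequence from a pre-shared secret, they stay synchronized while an eavesdropper hears only fragments on any single channel. Below is a minimal illustrative sketch in Python, using a seeded software pseudo-random generator and hypothetical channel values; the actual 1942 patent used an electromechanical, piano-roll-style tape to coordinate hops across 88 frequencies, not an algorithm.

```python
import random

def hop_sequence(shared_seed, channels, n_hops):
    """Derive a pseudo-random channel-hopping schedule from a shared seed.

    Transmitter and receiver seed identical generators, so they visit
    the same channels in the same order; a listener without the seed
    catches only disconnected fragments on any one channel.
    """
    rng = random.Random(shared_seed)
    return [rng.choice(channels) for _ in range(n_hops)]

# Illustrative carrier frequencies in MHz (hypothetical values).
channels = [88.1, 89.3, 90.7, 92.5, 94.9]
seed = 1942  # the shared secret both ends agree on beforehand

tx_schedule = hop_sequence(seed, channels, n_hops=8)
rx_schedule = hop_sequence(seed, channels, n_hops=8)
assert tx_schedule == rx_schedule  # both ends hop in lockstep
```

The mechanism differs, but the synchronization principle, a pre-agreed hop pattern known only to the two ends, is the same one later formalized as frequency-hopping spread spectrum and used in Bluetooth.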

Lamarr was the brains behind the invention, with her background knowledge in ammunition, and Antheil was the artist that brought it to life, using the piano for inspiration. In 1942, under her then-married name, Hedy Kiesler Markey, she filed for a patent for the Secret Communication System, patent case file 2,292,387, and proposed it to the Navy.

  • Patent Case File No. 2,292,387, Secret Communication System, Inventors Hedy Kiesler Markey and George Antheil, 1942. (National Archives Identifier 167820368)

The first part of Lamarr and Antheil’s Secret Communication System story did not see a happy Hollywood ending. The Navy refused to accept the new technology during World War II. Not only did the invention come from a civilian, but it was complex and ahead of its time.

As the invention sat unused, Lamarr continued on in Hollywood and found other ways to help with the war effort, such as working with the USO. It wasn’t until Lamarr’s Hollywood career came to an end that her invention started gaining notice.

Around the time Lamarr filmed her last scene with the 1958 film The Female Animal, her patented invention caught the attention of other innovators in technology. The Secret Communication System saw use in the 1950s during the development of CDMA network technology in the private sector, while the Navy officially adopted the technology in the 1960s around the time of the Cuban Missile Crisis. The methods described in the patent assisted greatly in the development of Bluetooth and Wi-Fi.

Despite the world finally embracing the methods of the patent as early as the mid-to-late 1950s, the Lamarr-Antheil duo was not recognized for the invention until the late 1990s and early 2000s. They both received the Electronic Frontier Foundation Pioneer Award and the Bulbie Gnass Spirit of Achievement Bronze Award, and in 2014 they were inducted into the National Inventors Hall of Fame.

Hedy Lamarr never had any formal training yet was able to incorporate her life experiences and artistic imagination into one of the most important inventions of the technological age. During a dark, chaotic time, she found the inspiration to try to help change the world for the better.

As we sit at home, waiting for the war against COVID-19 to reach its turning point, some may draw inspiration from Hedy Lamarr and ask themselves: what can I create today?


A Brief History of the Cheez-It

Dayton’s historic Edgemont neighborhood is cocooned inside a crook in the Great Miami River, a winding waterway that snakes through the heart of southwest Ohio. Two miles from downtown, with its air of industry, the community hearkens to a time when Dayton was hailed “The City of A Thousand Factories.”

In the early 20th century, inside a long-gone factory on the corner of Concord and Cincinnati Streets, the Green & Green cracker company cooked up its Edgemont product line, a collection of grahams, crackers and gingersnaps that were shipped across the region. But of the company's four Edgemont products, only one, a flaky one-by-one-inch cheese cracker, would revolutionize snack time. On May 23, 1921, when Green & Green decided to trademark the tasty treat's unique name, the Cheez-It was born.

“In 1921, Cheez-It didn’t mean anything, so Green & Green marketed the cracker as a ‘baked rarebit,’ ” says Brady Kress, president & CEO of Dayton’s Carillon Historical Park, a nationally recognized open-air museum centered on the city’s history of innovation. (Inside Carillon Brewing Company, a fully operating 1850s brewery at the park, costumed interpreters still bake crackers over an open hearth.) “People were familiar with rarebit, a sort of melted cheddar beer cheese spread over toast. Cheez-It offered the same great taste, only baked down into a cracker that will last.”

Cheez-It’s 11-month shelf life is impressive, but so is the company’s history. This month, America's iconic orange cracker turns 100. But the Cheez-It story stretches even further back than that.

The popular online food marketplace Goldbelly offered a limited-edition Cheez-Itennial Cake for a few days this week to celebrate the anniversary. (Kellogg)

In 1841, Dr. William W. Wolf moved to Dayton to practice homeopathy, a branch of alternative medicine that believes in the healing power of food. Hailed Dayton’s “Cracker King,” Wolf concocted the Wolf Cracker, a curious hard-butter snack made for medicinal purposes.

“In the 19th century, crackers were linked to Christian physiology and sectarian medical practitioners,” says Lisa Haushofer, a senior research associate at the University of Zurich’s Institute for Biomedical Ethics and History of Medicine. “Christian physiologists like Sylvester Graham, of Graham Cracker fame, were concerned about a modern diet that contained too many stimulating substances.” (In addition to being a cracker evangelist, Graham was also a pro-temperance Presbyterian minister who preached a vegetarian diet). Wolf echoed Graham’s concerns that food was far too rousing (though Graham also dubiously believed his crackers could cure licentiousness), so he launched the Wolf Cracker Bakery to churn out his wholesome snacks.

“They believed there was too much nourishment per food unit in modern bread, too much excitement,” says Haushofer. “So they recommended grain products made from coarse flour, which, they believed, contained a more natural ratio of nourishing and non-nourishing parts. Crackers were considered health food.”

According to Haushofer, homeopaths at the time were also concerned about digestibility, and since they believed heating food aided digestion, baked Wolf Crackers were just what the doctor ordered. But Wolf’s patients weren’t the only ones after his crackers. What started as a medical remedy soon became a sought-after treat.

In the 1870s, while living on the barren plains of North Dakota, Dayton natives J.W. and Weston Green often longed for a taste of home. “In those days food supplies were both expensive and scarce in that region,” wrote the Dayton Journal Herald in its October 31, 1907, edition, “and the father and son regularly sent back to their old home city, Dayton[,] for those necessities that could not be obtained there. ‘Invariably,’ Mr. Green says, ‘we would include in that order a good supply of … the ‘Wolfe Cracker’ [sic].”

J.W. Green never forgot the savory, buttery, nut-like flavor of Wolf Crackers. In 1897, when Wolf died, Green purchased the Wolf Bakery Company, then enlisted his son, Weston Green, to join him in business. The Greens renamed the enterprise Green & Green Company, and while Wolf’s recipe remained the same, they rebranded the doctor’s famous treat as the “Dayton Cracker.”

By the turn of the 20th century, Dayton held more patents, per capita, than any U.S. city. Surrounded by this innovative environment, Green & Green flourished, expanding its operations to nearby Springfield and Lima, and delivering baked goods across southwest Ohio. But soon, the company's crackers became more than a regional concern. During World War I, Green & Green fired up its ovens for the war effort.

“All our facilities but one little oven that can’t be used for Hard Bread will be speeded up to keep two car loads a day going by express,” read a Green & Green ad in the Dayton Daily News’s July 14, 1918, edition … “that OUR BOYS at the front may have their Fighting Bread.”

Though far less tasty than the Dayton Cracker, Dayton’s Fighting Bread sustained countless soldiers during the Great War. Typically made from salt, flour and water, Hard Bread—also known as hardtack, teeth dullers or jawbreakers—was often soaked in water before being served. If stored improperly, weevils and maggots made Hard Bread their home, prompting soldiers to dub the wartime ration “worm castles.”

“We are mighty glad and proud to be a cog in the big machine that will win the war,” read Green & Green’s ad. However, Doughboys weren’t the only ones helping win the war. “P.S. We could still use a few more women in the packing of Hard Bread.”

During World War I, Green & Green fired up its ovens for the war effort. This ad appeared in the Dayton Daily News's July 14, 1918 edition. (Dayton Daily News)

After World War I, Green & Green Company sidelined Hard Bread in favor of more flavorful fare. By Armistice Day, the Dayton Cracker (still made with Wolf’s original recipe) had been baked in Dayton for nearly 80 years. But while the hard butter-cracker was a local treasure, customers yearned for a delicate, flakier treat. Soon, Green & Green launched its Edgemont line, and in 1921, unveiled the “baked rarebit,” known as the Cheez-It.

“Welsh Rarebit, at its most basic form, is essentially a cheese sauce spread on toast,” says Rachael Spears, a living history specialist at Dayton’s Carillon Historical Park. “Some 19th-century English recipes specifically call for cheddar cheese. To this day, Cheez-It still advertises 100 percent real cheese, which draws a connection to its rarebit roots.”

But in 1921, Americans needed more than a novel snack. Following the Great War, the global economy dipped, and American wallets were increasingly thin. “Rarebit is a lesson in frugality,” says Kress. “It’s a nutritious dish that doesn’t cost a lot of money. When it’s baked down into a Cheez-It, it becomes a tasty treat. And just like hardtack, if you store it correctly, it will stay for a very long time. You don’t run the risk of it growing weevils.”

On May 23, 1921, when Green & Green decided to trademark the tasty treat’s unique name, the Cheez-It was born.

In 1915, one pound of Green & Green crackers sold for 10 cents, roughly $2.65 in 2021 dollars. “When Uncle Sam picked men for his army overseas,” read a June 1920 Green & Green ad, “he also picked foods that would keep those picked men robust and healthy—fit for the strenuous duties ahead of them. Just as the crackers for our soldiers kept sweet and fresh in tins, so Edgemont Crackers … keep crisp and creamy in the Family Tin. Ask mother to keep a tin in her pantry.”

Cheez-Its kept Americans fed during the post-war recession, throughout the Roaring Twenties, and at the onset of the Great Depression. But by 1932, Green & Green packed up its last Family Tin and sold the business to Kansas City’s Loose-Wiles Biscuit Company.

In 1947, the Loose-Wiles Biscuit Company became the Sunshine Biscuit Company; in 1996, Keebler acquired Sunshine; and in 2001, Kellogg acquired Keebler.

In this photo from the 1930s, workers at the Sunshine Biscuit Co. in Dayton fill Cheez-It boxes. (From the collections of Dayton History)

“The Cheez-It name has accompanied the baked cracker since its creation in 1921,” says Jeff Delonis, senior director of marketing for Cheez-It. “The original Cheez-It packaging was green and white. In the 1930s, red was introduced into the brand logo, and by the 1940s, the box included the iconic red and yellow-orange colors that remain today. The general shape and look of the cracker has largely stayed the same.”

Cheez-Its may still look the same, but the cracker’s production has soared. Once baked on the corner of Concord and Cincinnati Streets in Dayton’s Edgemont neighborhood, then shipped to regional grocers, Cheez-It sold more than 400 million packages in the U.S. alone last year.

“It’s super fun to think about all the cities around the country that were producing foods for local audiences,” says Kress. “Every city had them. Here’s an idea that came out of Dayton, Ohio.”

But “baked rarebit,” once a prevalent idiom used to describe an obscure cracker, has since faded, replaced by the now-ubiquitous term, Cheez-It.

“When you bake a cracker, you roll the dough out thin, kind of like a pie crust,” says Spears. “But at the heart, it’s like a thin, crispy biscuit. When you bite into a Cheez-It, you get those nice layers. Those are the layers that form if you cook it a bit.”

Like the Cheez-It itself, we need only bite into the snack’s history to uncover countless compelling layers.



