For the first time in US history, more than one of every 100 adults is in jail or prison, according to a new report documenting America's rank as the world's No. 1 incarcerator. It urges states to curtail corrections spending by placing fewer low-risk offenders behind bars.
Using state-by-state data, the report says 2,319,258 Americans were in jail or prison at the start of 2008 - one out of every 99.1 adults. Whether per capita or in raw numbers, it's more than any other nation.
The report, released Thursday by the Pew Center on the States, said the 50 states spent more than $49 billion on corrections last year, up from less than $11 billion 20 years earlier. The rate of increase for prison costs was six times greater than for higher education spending, the report said.
The steadily growing inmate population "is saddling cash-strapped states with soaring costs they can ill afford and failing to have a clear impact either on recidivism or overall crime," the report said.
Susan Urahn, managing director of the Pew Center on the States, said budget woes are pressuring many states to consider new, cost-saving corrections policies that might have been shunned in the recent past for fear of appearing soft on crime.
"We're seeing more and more states being creative because of tight budgets," she said in an interview. "They want to be tough on crime. They want to be a law-and-order state. But they also want to save money, and they want to be effective."
The report cited Kansas and Texas as states that have acted decisively to slow the growth of their inmate population. They are making greater use of community supervision for low-risk offenders and employing sanctions other than reimprisonment for offenders who commit technical violations of parole and probation rules.
"The new approach, born of bipartisan leadership, is allowing the two states to ensure they have enough prison beds for violent offenders while helping less dangerous lawbreakers become productive, taxpaying citizens," the report said.
While many state governments have shown bipartisan interest in curbing prison growth, there also are persistent calls to proceed cautiously.
"We need to be smarter," said David Muhlhausen, a criminal justice expert with the conservative Heritage Foundation. "We're not incarcerating all the people who commit serious crimes. But we're also probably incarcerating people who don't need to be."
According to the report, the inmate population increased last year in 36 states and the federal prison system.
The largest percentage increase - 12 percent - was in Kentucky, where Gov. Steve Beshear highlighted the cost of corrections in his budget speech last month. He noted that the state's crime rate had increased only about 3 percent in the past 30 years, while its inmate population had increased by 600 percent.
The report was compiled by the Pew Center's Public Safety Performance Project, which is working with 13 states on developing programs to divert offenders from prison without jeopardizing public safety.
"Getting tough on criminals has gotten tough on taxpayers," said the project's director, Adam Gelb.
According to the report, the average annual cost per prisoner was $23,876, with Rhode Island spending the most ($44,860) and Louisiana the least ($13,009). It said California - which faces a $16 billion budget shortfall - spent $8.8 billion on corrections last year, while Texas, which has slightly more inmates, was a distant second with spending of $3.3 billion.
On average, states spend 6.8 percent of their general fund dollars on corrections, the report said. Oregon had the highest spending rate, at 10.9 percent; Alabama the lowest at 2.6 percent.
Four states - Vermont, Michigan, Oregon and Connecticut - now spend more on corrections than they do on higher education, the report said.
"These sad facts reflect a very distorted set of national priorities," said Sen. Bernie Sanders, an independent from Vermont, referring to the full report. "Perhaps, if we adequately invested in our children and in education, kids who now grow up to be criminals could become productive workers and taxpayers."
The report said prison growth and higher incarceration rates do not reflect an increase in the nation's overall population. Instead, it said, more people are behind bars mainly because of tough sentencing measures, such as "three-strikes" laws, that result in longer prison stays.
"For some groups, the incarceration numbers are especially startling," the report said. "While one in 30 men between the ages of 20 and 34 is behind bars, for black males in that age group the figure is one in nine."
The racial disparity for women also is stark. One of every 355 white women aged 35 to 39 is behind bars, compared with one of every 100 black women in that age group.
The nationwide figures, as of Jan. 1, include 1,596,127 people in state and federal prisons and 723,131 in local jails. That's out of almost 230 million American adults.
The report said the United States incarcerates more people than any other nation, far ahead of more populous China with 1.5 million people behind bars. It said the US also is the leader in inmates per capita (750 per 100,000 people), ahead of Russia (628 per 100,000) and other former Soviet bloc nations which round out the Top 10.
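As a quick arithmetic check (not part of the report itself), the headline ratios can be roughly reproduced from the counts quoted above. The short Python sketch below uses only figures from this post, except for the total US population of about 300 million, which is an assumption; the per-100,000 figure it produces is close to, but not exactly, the report's 750, since that depends on the population estimate used.

```python
# Rough arithmetic check on the incarceration figures quoted above.
# The individual counts come from the report as quoted in this post;
# the total US population of ~300 million is an assumption (the article
# gives only the adult population of "almost 230 million").

state_and_federal_prisons = 1_596_127   # inmates in state and federal prisons
local_jails = 723_131                   # inmates in local jails
adults = 230_000_000                    # approximate number of US adults
total_population = 300_000_000          # assumed total US population

incarcerated = state_and_federal_prisons + local_jails
print(incarcerated)                        # matches the 2,319,258 in the report
print(round(adults / incarcerated, 1))     # ~99.2, i.e. roughly one adult in 99
print(round(incarcerated / total_population * 100_000))  # ~773 per 100,000 people
```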
The US also is among the world leaders in capital punishment. According to Amnesty International, its 53 executions in 2006 were exceeded only by China, Iran, Pakistan, Iraq and Sudan. (info from The Associated Press)
Friday, February 29, 2008
Thursday, February 28, 2008
1958: first black hockey player in NHL
Willie O'Ree (born 1935 in Fredericton, New Brunswick, Canada) is a retired professional ice hockey player, best known for being the first black player in the National Hockey League. He played as a winger for the Boston Bruins.
He is frequently but erroneously referred to as the first African American player, though he is Canadian. O'Ree is often called "the Jackie Robinson of ice hockey" due to breaking the color barrier in the sport.
Midway through his second minor-league season with the Quebec Aces, O'Ree was called up to the Boston Bruins of the NHL to replace an injured player. O'Ree was 95% blind in his right eye after being hit there by an errant puck two years earlier, which normally would have precluded him from playing in the NHL. However, O'Ree managed to keep it secret, and he made his NHL debut with the Bruins on January 18, 1958, against the Montreal Canadiens, becoming the first black player in league history. He played in only two games that year, then came back in 1961 to play 43 games, scoring four goals and 10 assists.
Willie O'Ree noted that "racist remarks were much worse in the US cities than in Toronto and Montreal," the two Canadian cities hosting NHL teams at the time, and that "Fans would yell, 'Go back to the South' and 'How come you're not picking cotton?' Things like that. It didn't bother me. I just wanted to be a hockey player, and if they couldn't accept that fact, that was their problem, not mine."
After O'Ree, there was no other black player in the NHL until fellow Canadian Mike Marson was drafted by the Washington Capitals in 1974. There were 17 black players in the NHL as of the mid-2000s, the most prominent including Canadians Jarome Iginla and Anson Carter and American Mike Grier. NHL players are now required to enroll in a diversity training seminar before each season, and racially based verbal abuse is punished through suspensions and fines.
O'Ree was inducted into the New Brunswick Sports Hall of Fame in 1984. He later became the Director of Youth Development for the NHL/USA Hockey Diversity Task Force, a non-profit program for minority youth that encourages them to learn and play hockey.
On January 19, 2008, the Bruins and NHL deputy commissioner Bill Daly honoured O'Ree at TD Banknorth Garden in Boston to mark the 50th anniversary of his NHL debut. In addition, The Sports Museum of New England established a special exhibit on O'Ree's career, comprising many items on loan from his personal collection. Those in attendance included a busload of friends from O'Ree's hometown of Fredericton. Two days earlier, the City of Fredericton honoured him by naming a new sports complex after him. (info from Wikipedia)
Wednesday, February 27, 2008
2005: first woman elected president in Africa
Ellen Johnson-Sirleaf (born in 1938) is the current president of Liberia, Liberia's first elected female president, and Africa's first elected female president.
Elected in 2005, she is the second elected black woman head of state in the world and the second female leader of Liberia, after Ruth Perry (who assumed leadership after an overthrow). She is known as the "Iron Lady".
Her grandfather was a German who married a Liberian woman. The grandfather was forced to leave the country during World War I.
Two of Johnson-Sirleaf's grandparents were indigenous Liberians. Her father was the son of the Gola chief Jahmale and Jenneh, one of his many wives. Because of Jahmale's friendship with and loyalty to President Hilary Richard Wright Johnson, and on the President's advice, her father was taken in by the settler McCritty family and his name was changed to Johnson.
Johnson-Sirleaf received a Bachelor of Science in Accounting at the University of Wisconsin in 1964, an economics diploma from the University of Colorado in 1970, and a Master of Public Administration from Harvard in 1971. She is a member of Alpha Kappa Alpha Sorority, a social action organization and the first collegiate sorority founded by and for Black women.
Returning to Liberia after Harvard, Johnson-Sirleaf became Assistant Minister of Finance in President William Tolbert's administration. In 1980, Tolbert was overthrown and killed by army sergeant Samuel Doe, ending decades of relative stability. Doe represented the Krahn ethnic group and was the first Liberian president not to be descended from the elite ex-American slave community. For the next ten years, Doe allowed the Krahn people to dominate public life.
After the overthrow of Tolbert, Johnson-Sirleaf went into exile in Kenya, where she worked for Citibank. She returned to run for Senate in 1985, but when she spoke out against Doe's military regime, she was sentenced to ten years in prison. Released after a short period, she moved to the US. She returned to Liberia again in 1997 as an economist, working for the World Bank and Citibank.
Initially supporting Charles Taylor's bloody rebellion against President Samuel Doe in 1990, she later went on to oppose him, and ran against him in the 1997 presidential elections. She managed only 10% of the votes, as opposed to Taylor's 75%. Taylor charged her with treason. She campaigned for the removal of President Taylor from office, playing an active and supportive role in the transitional government, as the country prepared itself for the 2005 elections. With Taylor's departure, she returned to take over the leadership of the Unity Party.
On March 15, 2006, President Johnson-Sirleaf addressed a joint meeting of the US Congress, asking for American support to help her country “become a brilliant beacon, an example to Africa and the world of what love of liberty can achieve.”
In 2007, President George W. Bush awarded her the Presidential Medal of Freedom, the highest civilian award given by the United States.
On July 26, 2007, President Sirleaf celebrated Liberia's 160th Independence Day under the theme "Liberia at 160: Reclaiming the future." In an unprecedented and symbolic move, she asked 25-year-old Liberian activist Kimmie Weeks to serve as National Orator for the celebrations. Weeks became Liberia's youngest National Orator in over a hundred years and delivered a powerful speech calling for the government to prioritize education and health care. A few days later, President Sirleaf issued an Executive Order making education free and compulsory for all elementary-school-aged children. (info & photo from Wikipedia)
1979: first Sony Walkman
Walkman is a popular Sony brand used to market its portable audio and video players. The Walkman introduced and popularized a change in music listening habits, allowing people to carry their own choice of music with them and listen privately through headsets without disturbing others. It prepared the world for the iPod and other modern portable media players.
The original Walkman was marketed in 1979 as the Walkman in Japan, the Soundabout in many other countries including the US, Freestyle in Sweden and the Stowaway in the UK. It was created by audio division engineer Nobutoshi Kihara for Sony co-chairman Akio Morita, who wanted to be able to listen to operas during his frequent transpacific plane trips.
Morita hated the name "Walkman" and asked that it be changed, but relented after being told that an ad campaign had already begun with the Walkman name and would be too expensive to change.
However, the first portable personal stereo audio cassette player, called Stereobelt, had been invented earlier by the German-Brazilian Andreas Pavel in 1972, and patented in the US in 1978. After lengthy legal battles, Pavel was finally recognized by Sony in 2003 as the original inventor of the Walkman.
"Walkman", "Pressman", "Watchman", "Scoopman", "Discman", and "Talkman" are trademarks of Sony, and have been applied to many portable entertainment devices. Sony continues to use the "Walkman" brand name for portable audio devices, after the "Discman" name for CD players was dropped in the late 1990s. According to Sony, the plural form is "Walkman Personal Stereos." rather than "Walkmans" or "Walkmen." (info from Wikipedia)
Tuesday, February 26, 2008
1979: first (maybe) laptop PC
Designed in 1979 by British designer William Moggridge for Grid Systems, the Grid Compass was one-fifth the weight of any machine of equivalent performance and was used by NASA on the space shuttle program in the early 1980s.
It was the first computer from Grid, and the very first "clamshell" laptop. It was an expensive business computer with large RAM and data storage for its time, and it had one of the first plasma screens capable of displaying graphics.
It used a magnesium case that provided protection and acted as a heat sink, so there was no cooling fan. Strangely, there was no carrying handle. Instead of a disk drive, the Compass had 384 KB of non-volatile bubble memory. Software could be loaded from a Grid server, or from an external floppy or hard disk drive. It had a built-in modem.
In addition to being the first laptop to go into space, it was also used aboard naval vessels, and was carried by paratroopers dropped behind enemy lines to monitor troop and equipment movements and send data back to the command center.
Grid was way ahead of everyone else, and they felt they could sit back and do nothing. By the time they realized they needed to improve, they couldn't catch up with competitors.
When Grid went out of business they sold technology to Radio Shack for use in Tandy TRS-80 computers. Later Tandy lawyers noticed they had purchased the patent on clamshell computers and notified all the laptop makers that they would begin collecting royalties. After a legal battle, the patent was upheld and a small portion of the price of every clamshell laptop sold by anybody went to Tandy! Apparently Grid had never enforced their patent, so other makers produced similar designs for many years before Tandy started making royalty claims. By then the clamshell design was firmly entrenched and companies had to pay or get out of the laptop business. (info from About.com and old-computers.com, photo from old-computers.com)
Monday, February 25, 2008
2000: last country allows Internet access
Eritrea, in Northeast Africa, became an Italian colony in 1890. After Italy's defeat in World War II, Eritrea was administered by the British from 1941 to 1952. From 1952 to 1962, following a UN resolution, Eritrea was an autonomous territory federated with Ethiopia.
The UN decision was made without much attention to the desires of the Eritrean people. Increasing unrest and resistance in Eritrea against the federation with Ethiopia eventually led the Ethiopian government to annex Eritrea as a province in 1962. An Eritrean independence movement formed in the early 1960s and later erupted into a 30-year civil war against successive Ethiopian governments that ended in 1991. Following a UN-supervised referendum in which the Eritrean people overwhelmingly voted for independence, Eritrea declared its independence and gained international recognition in 1993.
Eritrea is a sliver of land along the Red Sea coast, about the same size (36,170 square miles) as Maine, with a population of about 3.5 million. Running counter to the common image of a chaotic Africa, Eritrea is a country with a vision and strong sense of identity. Although the modern infrastructure of the former Italian colony was destroyed, the protracted guerrilla war brought out the best in the Eritrean people. They learned to be completely self-reliant, to work together harmoniously with no thought except the ultimate goal of victory. While the international media ignored or misread the Eritrean struggle for independence, the fighters created a new culture, a mosaic of traditional beliefs, battlefield pragmatism, and political ideology from East and West.
Eritrea allowed its citizens to go online in 2000, the last country to allow access to the Internet. (info from bnet.com, Wikipedia & Habtom Yohannes)
Friday, February 22, 2008
2008: beginning of the end for fall TV season
It may soon be time to retire the phrase “fall television season.”
NBC Universal took a big step toward undoing one of the television industry’s oldest traditions by announcing Tuesday that it would move to a year-round schedule of staggered program introductions. The move is intended to appeal to advertisers, who crave fresh content to keep viewers tuned in.
And if it succeeds — and leads other broadcast networks to shift from their focus on a mass introduction of new shows — it could alter an American cultural cycle that extends all the way back to the days of radio, when families gathered around the Philco every September, as the school year began, to sample the new entertainment choices.
NBC plans to announce a 52-week schedule in April, a month before ABC and CBS will unveil their fall lineups at splashy presentations known as upfronts. The decision means that NBC will be committing to a new lineup of shows earlier than any of its competitors, while also inviting advertisers to build marketing plans around specific shows and perhaps to integrate brands and products into the plots of the shows themselves.
The fall television season has been under assault on many fronts, from the many cable channels that introduce new shows whenever they find it convenient, to individual series like ABC’s “Grey’s Anatomy” that made their debuts in odd months like March.
Viewers are accustomed to a spring lineup from Fox, for instance, and to fresh slates of reality shows during the summer.
But the move by NBC Universal represents a particularly bold stroke by a network with the size and clout to move markets. After it announces a list of programs in April, NBC plans to meet with big advertising clients in several cities, followed by a different sort of presentation in May that will encompass all the NBC Universal properties, including cable channels like Bravo, USA and CNBC.
What that event will not include is a special introduction of the fall prime-time schedule, of the kind NBC has held for years at Radio City Music Hall and which its broadcast network competitors still intend to stage this year. NBC is looking for a different site for the presentation because the Music Hall is not appropriate for the plans it has for that day. But the day will include an introduction of the yearlong programming plans for the press, as well as a party for advertising clients that will include some NBC stars.
The idea of a 52-week schedule is not really new. Most networks have programming scattered through the year, with specific shows set aside for summer, like “Big Brother” on CBS (though it was used in the regular season this year because of the strike), and others for midyear, like “24” and “American Idol” on Fox and “Lost” on ABC.
But NBC intends to give advertisers a much earlier look at its plans for the entire year. That will presumably make it easier to match advertisers to specific shows, an idea that is growing in popularity. Networks are looking for ways to keep clients paying, even as ratings diminish and programs are replayed on digital recorders with commercials skipped.
One potential benefit of the change, according to Gene DeWitt, chairman of DeWitt Media in New York, is a solution to advertisers’ annual quandary. The last three months of the year are the most important for many marketers — particularly retailers and automakers — but under the current system many of the broadcast shows they are offered then are new and untested.

If more shows are brought out earlier in the calendar year, he said, “you’d have a track record of their performance.” “We’d have more reliable rating information,” he added, “so we won’t be going into the fourth quarter blind.”

A 52-week broadcast schedule may make it more difficult to track the hits and flops, Mr. DeWitt said, but “it’s the way of the world today — things move faster, and we all have to keep up.” (info from The New York Times)
Thursday, February 21, 2008
2009: end of US embargo on Cuba
In the inaugural address in January 2009, incoming President Bararry Clintobama called for a prompt end to the unilateral embargo of trade with Cuba that dates back to the Cold War in 1962.
The embargo was imposed after Cuba seized properties of American people and businesses, particularly United Fruit and ITT.
The embargo was codified into law in 1992 with the stated purpose of "bringing democracy to the Cuban people", and is entitled the Cuban Democracy Act. In 1996 Congress passed the Helms-Burton Act which further restricted US citizens from doing business in or with Cuba, and mandated restrictions on giving public or private assistance to any successor regime in Havana unless and until certain claims against the Cuban government are met.
In 1999, President Bill Clinton expanded the trade embargo even further by ending the practice of foreign subsidiaries of US companies trading with Cuba in dollar amounts totaling more than $700 million a year.
The embargo was one of the few times in history that US citizens were restricted from doing business abroad, and was the most enduring trade embargo in modern history. Despite the existence of the embargo, the US is the seventh largest exporter to Cuba (4.3% of Cuba's imports are from the US).
The political elite in Washington privately acknowledge that the embargo is a failure. Publicly, they defend it because of fears that the Cuban American community, concentrated in politically powerful Florida, will vote against them. In October 2007, President Bush reiterated his commitment to it in a speech to Cuban dissidents, and none of the leading presidential candidates for the 2008 election called for ending the embargo.
The policy was useless as a tool for beating the recently retired Fidel Castro, and it hindered opportunities for American industries from travel to banking to agriculture; many US business groups have lobbied to end it.
Far from hurting the Communist regime, the embargo gave Castro an excuse to rail against the US to his own people and to the world. Every year, Cuba asked the United Nations for a vote to lift the embargo. In 2007, the vote was 183 to 4 against the US.
The embargo made the good old USA look like an arrogant bully. In the early days of the Cold War, the US pressured other countries to help isolate Castro by severing trade ties, but the other countries eventually gave in. That's why you could buy a fresh Cuban cigar almost anywhere but in the US.
Sanctions are hard to enforce when the world agrees on them, as with Saddam Hussein's Iraq. With Cuba it was an embargo of one. Italian phone companies, French hotels, and Korean automakers were quite happy to trade with an island just 90 miles from Florida.
Cuba is not a huge market. Its population is only 11 million and its G.D.P. only $46 billion. By comparison, Vietnam, the last Communist country where the US had an embargo, has 85 million people and a G.D.P. of $262 billion. Selling to Cuba won't slash the US trade deficit, but it won't hurt. (some info from Wikipedia and Portfolio)
Wednesday, February 20, 2008
2000: first president elected by one voter
The 2000 presidential race between Democrat Al Gore and Republican George Dubya Bush was close and complicated. Equipment problems, particularly Florida's notorious "hanging chads," resulted in a close and shifting tally.
Exit polls showed the election won by Gore, then Bush. Many people went to bed thinking that Gore had won, only to discover in the morning that Bush had been declared the winner. The election process went on for weeks. It was simply too close to call. Several states were up for grabs, but in the end it came down to Florida, where Bush's younger brother, Jeb, was governor.
Florida's electoral votes could not be awarded to either Bush or Gore because of the closeness of the vote. Tirades erupted in several precincts where the candidates' backers traded accusations about improprieties, such as race-based voter rejection and confusing ballots and equipment.
In a small precinct made up largely of Jewish Democrats from New York and New Jersey, Patrick Buchanan, who was perceived by many Jews as an anti-Semite, somehow took 47 votes for president. Some people said they punched a hole on the ballot next to Gore's name that actually registered a vote for Buchanan. There were many who said they knew they mistakenly voted for him but had not realized it until after they cast their ballots. Others said they knew immediately they had made a mistake and punched a second hole in the presidential race, voiding their vote.
And some said they knew they had cast the wrong vote but did nothing because they said they were either embarrassed, ashamed or did not know what to do. Many did ask for a new ballot after mistakenly punching the wrong hole but before inserting the ballot in the box, and they received a new one. But others said they did not know they could.
Of the more than 1,700 votes cast at Precinct 162G, only 47 people, or less than 4 percent, voted for Buchanan, and residents there said that all or almost all of those 47 votes were actually intended for Gore. It is a small percentage in this precinct, but in such a close race every wasted vote seemed to cause great misery.
Recounts were started, then stopped, as Republicans and Democrats wrangled over what standards to apply. It was more than a little chaotic. The closeness of the outcome, as well as reports of votes being miscounted, led to the Florida election recount.
Two initial recounts went to Bush, but that outcome was tied up in courts for a month until reaching the US Supreme Court. On December 9 the Court halted the statewide hand recount that the Florida Supreme Court had ordered, and on December 12 it ruled that the recount could not resume, holding that the differing standards used by different counting procedures violated the Equal Protection Clause of the Fourteenth Amendment. The machine recount showed that Bush had won the Florida vote by a margin of 537 votes out of six million cast.
Bush received 271 electoral votes to Gore's 266 as a result of the Florida outcome. However, he lost the popular vote by more than half a million votes, making him the first president elected without at least a plurality of the popular vote since Benjamin Harrison in 1888.
The US Supreme Court voted five to four to send the case back to the Florida Supreme Court, which had no alternative but to dismiss it. The presidential election of 2000 had been decided, in essence, by the vote of one Supreme Court justice.
Needless to say, Bush supporters were jubilant and Gore supporters were incensed. Many people were simply happy to have things settled, but others worried that the Court had gone too far. (info from the New York Times, Wikipedia & Dummies.com)
Tuesday, February 19, 2008
2008: Fidel Castro resigns
Ailing and elderly Cuban President Fidel Castro resigned early Tuesday, saying in a letter published in official online media that he wouldn't accept a new term when the newly elected parliament meets on Sunday.
"I will not aspire nor accept, the post of President of the Council of State and Commander in Chief," read a letter signed by Castro published quietly overnight without advance warning in the online edition of the Communist Party daily Granma.
The new National Assembly, elected in January, meets for the first time on Sunday to pick the governing Council of State, including the presidency Castro holds. There had been wide speculation about whether he would accept a nomination for re-election to that post or retire.
The 81-year-old's overnight announcement effectively ends his rule of almost 50 years over Cuba, positioning his 76-year-old brother, Raul, for permanent succession to the presidency.
Over the decades, the fiery guerrilla leader reshaped Cuba into a communist state 90 miles from U.S. shores and survived assassination attempts, a CIA-backed invasion and a missile crisis that brought the world to the brink of nuclear war. Since his rise to power on New Year's Day 1959, Mr. Castro resisted attempts by 10 US administrations to topple him, including the disastrous Bay of Pigs invasion in 1961.
The US discovery of nuclear-armed missiles on the island led to a showdown of the world's then-superpowers before the Soviet Union agreed to remove them.
Monarchs excepted, Castro was the world's longest ruling head of state. His ironclad rule ensured Cuba remained among the world's five last remaining communist countries, long after the breakup of the Soviet Union and collapse of communism across Eastern Europe.
Castro's designated successor was his brother, Raul, five years younger and No. 2 in Cuba's power structure as defense minister. Raul Castro had been in his brother's rebel movements since 1953.
Castro had already temporarily ceded his powers to his brother on July 31, 2006, when he announced that he had undergone intestinal surgery. More than a year after falling ill, the elder Castro still hadn't been seen in public, appearing only sporadically in official photographs and videotapes and publishing dense essays about mostly international themes as his younger brother began to consolidate his rule.
But the US, bent on blocking Castro's plans for his younger brother to succeed him, built a detailed plan in 2005 for American assistance to ensure a democratic transition on the island of 11.2 million people after his death. Castro and other Cuban officials long insisted "there will be no transition" and that the island's socialist political and economic systems will live on long after he is gone.
Castro's supporters admired his ability to provide a high level of health care and education for citizens while remaining fully independent of the US. But his detractors called him a dictator whose totalitarian government denied individual freedoms and civil liberties such as speech, movement and assembly. (info from The Wall Street Journal)
"I will not aspire nor accept, the post of President of the Council of State and Commander in Chief," read a letter signed by Castro published quietly overnight without advance warning in the online edition of the Communist Party daily Granma.
The new National Assembly is meeting for first time Sunday since January elections to pick the governing Council of State, including the presidency Castro holds. There had been wide speculation about whether he would accept a nomination for re-election to that post or retire.
The 81-year-old's overnight announcement effectively ends his rule of almost 50 years over Cuba, positioning his 76-year-old brother, Raul, for permanent succession to the presidency.
Over the decades, the fiery guerrilla leader reshaped Cuba into a communist state 90 miles from U.S. shores and survived assassination attempts, a CIA-backed invasion and a missile crisis that brought the world to the brink of nuclear war. Since his rise to power on New Year's Day 1959, Mr. Castro resisted attempts by 10 US administrations to topple him, including the disastrous Bay of Pigs invasion in 1961.
The US discovery of nuclear-armed missiles on the island led to a showdown of the world's then-superpowers before the Soviet Union agreed to remove them.
Monarchs excepted, Castro was the world's longest ruling head of state. His ironclad rule ensured Cuba remained among the world's five last remaining communist countries, long after the breakup of the Soviet Union and collapse of communism across Eastern Europe.
Castro's designated successor was his brother, Raul, five years younger and No. 2 in Cuba's power structure as defense minister. Raul Castro had been in his brother's rebel movements since 1953.
Castro had already temporarily ceded his powers to his brother on July 31, 2006, when he announced that he had undergone intestinal surgery. More than a year after falling ill, the elder Castro still hadn't been seen in public, appearing only sporadically in official photographs and videotapes and publishing dense essays about mostly international themes as his younger brother began to consolidate his rule.
But the US, bent on blocking Castro's plans for his younger brother to succeed him, built a detailed plan in 2005 for American assistance to ensure a democratic transition on the island of 11.2 million people after his death. Castro and other Cuban officials long insisted "there will be no transition" and that the island's socialist political and economic systems will live on long after he is gone.
Castro's supporters admired his ability to provide a high level of health care and education for citizens while remaining fully independent of the US But his detractors called him a dictator whose totalitarian government denied individual freedoms and civil liberties such as speech, movement and assembly. (info from The Wall Street Journal)
Monday, February 18, 2008
1971: first Presidents Day on Monday
Presidents Day (or Presidents' Day), is the common name for the federal holiday officially designated as Washington's Birthday. It is celebrated on the third Monday of February.
As the official title of the federal holiday, Washington's Birthday was originally implemented by the US federal government in 1880 in the District of Columbia and expanded in 1885 to include all federal offices. As the first federal holiday to honor an American citizen, the holiday was celebrated on Washington's actual birthday, February 22.
In 1971 the federal holiday was shifted to the third Monday in February by the Uniform Monday Holiday Act. A draft of the Uniform Holidays Bill of 1968 would have renamed the holiday to Presidents' Day to honor both Washington and Lincoln, but when signed into law in 1968 simply moved Washington's Birthday.
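Because the 1971 Act ties the holiday to a weekday rule rather than a fixed date, Washington's Birthday now falls somewhere between February 15 and February 21 each year, so the observed holiday can never land on Washington's actual birthday, February 22. As a small illustration (mine, not from the original post), here is how the "third Monday in February" can be computed with Python's standard library:

```python
import calendar

def washingtons_birthday(year):
    """Return the date of the third Monday in February for a given year."""
    feb_mondays = [d for d in calendar.Calendar().itermonthdates(year, 2)
                   if d.month == 2 and d.weekday() == calendar.MONDAY]
    return feb_mondays[2]  # the third Monday

print(washingtons_birthday(1971))  # 1971-02-15, the first observance under the Act
print(washingtons_birthday(2008))  # 2008-02-18, the holiday this post marks
```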
In the late 1980s, with a push from advertisers, the term Presidents Day began its public appearance. The theme has expanded the focus of the holiday to honor another President born in February, Abraham Lincoln, and often other Presidents of the United States. Although Lincoln's birthday, February 12, was never a federal holiday, approximately a dozen state governments have officially renamed their Washington's Birthday observances as "Presidents Day", "Washington and Lincoln Day", or other such designations.

It is also interesting to note that "Presidents Day" is not always an all-inclusive term. In Massachusetts, while the state officially celebrates "Washington's Birthday," state law also prescribes that the governor issue an annual Presidents Day proclamation honoring the presidents that have come from Massachusetts: John Adams, John Quincy Adams, Calvin Coolidge, and John F. Kennedy. (Coolidge, the only one born outside of Massachusetts, spent his entire political career before the vice presidency there. George H.W. Bush, on the other hand, was born in Massachusetts, but has spent most of his life elsewhere.)
Alabama uniquely observes the day as "Washington and Jefferson Day," even though Jefferson's birthday was in April. In Connecticut, while Presidents Day is a federal holiday, Abraham Lincoln's birthday is still a state holiday, falling on February 12 regardless of the day of the week.
In Washington's home state of Virginia the holiday is legally known as "George Washington Day."
The holiday is now known as a day when many stores hold sales. Until the late 1980s, corporate offices were mostly closed, as on Memorial Day or Christmas. With the late 1980s advertising push to rename the holiday, more and more businesses are staying open on the holiday each year, and, as on Veterans Day and Columbus Day, most delivery services outside of the Post Office now offer regular service. Some public transit systems have also gone to regular schedules on the day.
Various theories exist for this, one accepted reason being to make up for the growing trend of corporations to close in observance of the Birthday of Martin Luther King, Jr. However, a review of the Uniform Monday Holiday Bill debate of 1968 in the Congressional Record shows that supporters of the bill were intent on moving federal holidays to Mondays to promote business. As with many federal holidays, few Americans actually celebrate Presidents Day; it is mainly known as a day off from work or school, although most non-government workers do not get the day off.
Consequently, some schools, which used to close for a single day for both Lincoln's and Washington's birthday, now often close for the entire week (beginning with the Monday holiday) as a "mid-winter recess". For example, the New York City school district began doing so in the late 1990s.
The federal holiday Washington's Birthday is intended to honor the accomplishments of the man who has been referred to, for over two centuries, as "The Father of his Country". Celebrated for his leadership in the founding of the nation, he was the Electoral College's unanimous choice to become the first President; he was seen as a unifying force for the new republic and set an example for future holders of the office. (info & photo from Wikipedia)
Friday, February 15, 2008
1822: Catholic Church admits that Earth is not center of universe
In 1610, Italian astronomer Galileo Galilei built a telescope to observe the solar system, and deduced that planets orbit the sun, not the earth.
This contradicted Church teachings, and some of the clergy accused Galileo of heresy. (In 1600, Giordano Bruno was convicted of being a heretic for believing that the earth moved around the sun, and that there were many planets throughout the universe where life existed. Bruno was burnt to death.)
Galileo moved on to other projects. He started writing about ocean tides, but instead of writing a scientific paper, he found it much more interesting to frame the argument as a conversation among three fictional characters. One character, who would support Galileo's side of the argument, was brilliant. Another character would be open to either side of the argument. The final character, named Simplicio, was dogmatic and foolish, representing Galileo's enemies who ignored any evidence that Galileo was right. Soon, Galileo wrote a similar dialogue called "Dialogue on the Two Great Systems of the World," about the Copernican system.
"Dialogue" was an immediate hit with the public, but not with the Church. The pope suspected that he was the model for Simplicio. He ordered the book banned, and also ordered Galileo to appear before the Inquisition in Rome for the crime of teaching the Copernican theory after being ordered not to do so.
Galileo was 68 years old and sick. Threatened with torture, he publicly confessed that he had been wrong to have said that the Earth moves around the Sun. Legend has it that after his confession, Galileo whispered "And yet, it moves."
Galileo was allowed to live under house arrest. Until his death in 1642, he continued to investigate science, and even published a book on force and motion after he had become blind.
The Church eventually lifted the ban on Galileo's Dialogue in 1822, when it was common knowledge that the Earth was not the center of the universe. Still later, statements by the Vatican Council in the early 1960s and in 1979 implied that Galileo was pardoned, and that he had suffered at the hands of the Church. Finally, in 1992, three years after Galileo Galilei's namesake spacecraft had been launched on its way to Jupiter, the Vatican formally and publicly cleared Galileo of any wrongdoing. (info from NASA and the History Channel) (portrait by Justus Sustermans painted in 1636)
Thursday, February 14, 2008
1957: first independent black sub-Saharan nation
Gold Coast was a British colony on the Gulf of Guinea in West Africa that became the independent nation of Ghana in 1957.
The first Europeans to arrive at the coast were the Portuguese, in 1471. They encountered a variety of African kingdoms, some of which controlled substantial deposits of gold. In 1482, the Portuguese built the Castle of Elmina, the first European settlement on the Gold Coast, where they traded slaves, gold, knives, beads, mirrors, rum and guns.
Eventually, English, Dutch, Danish, German and Swedish traders arrived and built forts along the coastline. The Gold Coast was formed in 1821 when the British government seized privately held lands along the coast. Gold Coast had long been a name for the region used by Europeans, due to the large gold resources to be found in the area, although slave trade was the principal exchange for a number of years. In 1872, the Dutch lost interest in the coast and gave up their forts to the British.
Britain steadily expanded the colony by invading local kingdoms and the Ashanti Confederacy, and by taking over the holdings of other European countries which had colonies in the region.
Britain's main problem was the Ashanti people, who controlled much of Ghana before the Europeans arrived and are still today the biggest community in Ghana. During the First Anglo-Ashanti War (1863-1864), the two groups fought because of a disagreement over an Ashanti chief and slavery.
Tensions increased during the Second Ashanti War (1873-1874), when the British sacked the Ashanti capital of Kumasi. The Third Ashanti War (1893-1894) occurred because the new Ashanti ruler wanted to exercise his power. From 1895 to 1896 the British and Ashanti fought the fourth and final Ashanti War, in which the Ashanti fought for and lost their independence.
In 1900 the Ashanti Uprising, provoked by a British attempt to seize the Ashanti throne, resulted in the Ashanti capture and, shortly after, loss of Kumasi. At the end of this last Ashanti war, in 1902, the Ashanti lands became a British protectorate.
By 1901, all of the Gold Coast was a British colony, with its kingdoms and tribes forming a single unit. Various natural resources — such as gold, metal ores, diamonds, ivory, pepper, timber, corn and cocoa — were shipped from the Gold Coast by the British. The British colonisers built railways and a complex transport infrastructure which formed the basis for the transport infrastructure in modern-day Ghana. Western hospitals and schools were also built, an attempt by the British to export what were then modern-day amenities to the people of the Empire.
However, by 1945, demands for more autonomy from the Gold Coast population were beginning to arise, in the wake of the Second World War and the start of decolonisation across the world.
By 1956, British Togoland, the Ashanti protectorate, and the Fante protectorate had been merged with the Gold Coast to create one colony. In 1957 the colony gained independence under the name of Ghana, the first independent black sub-Saharan nation on the continent.
After a checkered independent history, punctuated by a spate of military takeovers known locally as "booms", Ghana stands as one of Africa's most respected democracies - an enviable status in a region racked by coups and wars. As Ghana became independent, the first president, Kwame Nkrumah, told his fellow citizens: "Ghana, your beloved country, is free forever."
Many Ghanaians see peace as the country's greatest achievement today - that and the fact their country gave the world former United Nations Secretary-General Kofi Annan.
Ghana's 1957 breakout from colonialism triggered a wave of independence movements and liberation struggles that changed the map of the African continent. In less than two decades, the patchwork of colonial dominions carved up by European powers at the end of the 19th century became a group of new states.
But Nkrumah's dream for a new Africa, strong, free and prosperous, rapidly turned sour in Ghana as his personalized rule led to persecution of opponents and his profligate spending brought the country's once-rich economy to collapse. He was overthrown in a coup in 1966 - one of a spate of military takeovers that rippled across the continent and stained independence dreams with bloodshed.
Ghana stumbled from coup to coup until Jerry Rawlings, himself a leader of two "booms" in 1979 and 1981, restored democratic elections in 1992. The current president, John Kufuor, who was elected in 2000, is due to stand down at elections in 2008.
For many Ghanaians, the nation's greatest failing has been the slow pace of economic development. Some still lack running water and electricity at home.
Compared with many struggling West African economies, Ghana can boast steady growth, low inflation, and rising gold and cocoa output that has attracted foreign investors. But its achievements pale when compared with Asian economies, such as Malaysia and South Korea, that it saw as equals in the 1950s. (info from Wikipedia and freedominion.ca)
Wednesday, February 13, 2008
1865: slaves freed (late) in Texas
With the arrival of an Army ship in Galveston on June 19, 1865, Texas was the last state to learn that the South had surrendered two months earlier. More than two years after the Emancipation Proclamation went into effect on Jan. 1, 1863, the 250,000 slaves in Texas were finally freed.
The date in June is commemorated as Juneteenth, which is traditionally celebrated on the third Saturday in June. It began taking root across the country largely because of enthusiastic black "Texpats" in other states, like Joe Kings, a retired Army medical administrator who spent 11 years stationed at Fort Hood in Texas. After buying a business in Portland, Maine, he held a Juneteenth picnic the very first year.
"Even the black people here didn't know about Juneteenth," Mr. Kings said. "Now the white ladies come by on the first of June and start asking: 'When's Juneteenth?'"
With its lighthearted name and tragicomic origins, Juneteenth appeals to many Americans by celebrating the end of slavery without dwelling on its legacy. Juneteenth, its celebrators say, is Martin Luther King's Birthday without the grieving. (info from The New York Times)
Tuesday, February 12, 2008
2008: Polaroid stops making film
Polaroid Corp. is shutting its remaining US film plants in March and will stop making film for its venerable instant-picture cameras by the end of the year.
This doesn't necessarily mean Polaroid customers will be completely out of luck, as the company is trying to sell its technology, and a sale could result in a third party making film for Polaroid cameras.
The company has turned its focus to digital, where the film business has migrated over the past decade.
Polaroid filed for bankruptcy protection in 2001 and sold most of its assets and trademark name to the private-equity arm of Bank One in 2002. Bank One sold Polaroid in 2005 to Petters Group Worldwide, a holding company for numerous consumer brands. The name is now used on a variety of products, including TVs and DVD players.
Polaroid has two film plants in Massachusetts, where the company is based, and facilities in Mexico and the Netherlands. The US plants employ about 150 people. The company has already stopped making instant cameras. Various film products have already been carrying stickers indicating they are being discontinued, and there has been some hoarding by customers. (info from The Wall Street Journal)
Monday, February 11, 2008
2008: jazz CD is Grammy album of the year
Last night Herbie Hancock, not some country twanger or rapper, won the Grammy award in the album of the year category for his tribute to Joni Mitchell, “River: The Joni Letters,” which featured artists including Corinne Bailey Rae and Norah Jones offering gentle renditions of Mitchell’s songs. The album’s unexpected victory represents the first time in 43 years that a jazz album has won the album of the year prize.
“What a beautiful day this is in Los Angeles!” Hancock, a favorite of the National Academy of Recording Arts and Sciences who had won 10 Grammy trophies before Sunday, said as he took the stage.
Though the choice of Hancock may stoke criticism that Grammy voters are out of step with pop music’s cutting edge, the decision was defended backstage. Vince Gill, the country superstar who lost out to Hancock in the album of the year field, said Hancock was “hands-down a better musician than all of us put together.”
Neil Portnow, the president of the academy, which bestows the awards, disputed characterizations of Hancock as irrelevant, saying the competition is based on excellence, not sales. (info from The New York Times)
Friday, February 8, 2008
1999: last country gets television
April 2002 was a turbulent month for the people of Bhutan. One of the remotest nations in the world, perched high in the snowlines of the Himalayas, suffered a crime wave. The 700,000 inhabitants of a kingdom that calls itself the Land of the Thunder Dragon had never experienced serious law-breaking before. Yet now there were reports of fraud, violence and murder.
The Bhutanese had always been proud of their incorruptible officials - until Parop Tshering, the chief accountant of the State Trading Corporation, was charged with embezzling. Every aspect of Bhutanese life is steeped in Himalayan Buddhism, and yet thieves vandalized and robbed three ancient religious monuments.
In Thimphu, Bhutan's sedate capital, where overindulgence in rice wine had been the only social vice, a truck driver bludgeoned his wife to death after she discovered he was addicted to heroin. In Bhutan, family welfare has always come first, yet a farmer drove his terrified in-laws off a cliff in a drunken rage.
Why was this kingdom with its head in the clouds falling victim to the kind of crime associated with urban life in America and Europe? For the Bhutanese, the only explanation seemed to be five large satellite dishes, planted in a vegetable patch.
In June 1999, Bhutan became the last nation in the world to turn on television. The Dragon King had lifted a ban on the TV screen as part of a radical plan to modernize his country, and thousands began to pay about $10 per month for cable TV service that provided 46 channels.
Four years later, those same subscribers were beginning to accuse television of smothering their unique culture, of promoting a world that is incompatible with their own, and of threatening to destroy a place where time has stood still for half a millennium.
A refugee monk from Tibet, the Shabdrung, created this tiny country in 1616 as a Buddhist sanctuary, a refuge from the ills of the world. So successful were he and his descendants at isolating themselves that by the 1930s virtually all that was known of Bhutan in the west came from James Hilton's novel Lost Horizon.
He called it Shangri-la, a secret Himalayan valley, whose people never grew old and lived by principles laid down by their high lama: "Here we shall stay with our books and our music and our meditations, conserving the frail elegancies of a dying age."
In the real Bhutan, there were no public hospitals or schools until the 1950s, and no paper currency, roads or electricity until several years after that. Bhutan had no diplomatic relations with any other country until 1961, and the first invited western visitors came only in 1974, for the coronation of Dragon King Jigme Singye Wangchuck. Today, although many people drive cars, there is still no word in the Bhutanese language for "traffic jam."
But none of these developments has made such a fundamental impact on Bhutanese life as TV. Since the April 2002 crime wave, the national newspaper has called for the censoring of television. An editorial warns: "We are seeing for the first time broken families, school dropouts and other negative youth crimes. We are beginning to see crime associated with drug users all over the world - shoplifting, burglary and violence."
The Bhutanese government itself says that it is too early to decide whether television is to blame. Sangay Ngedup, minister for health and education, will concede that there is a gulf opening up between old Bhutan and the new: "Until recently, we shied away from killing insects, and yet now we Bhutanese are asked to watch people on TV blowing heads off with shotguns. Will we now be blowing each other's heads off?"
The people of Bhutan, however, finally decided for themselves what would make them happy. The 1998 World Cup in France was driving the soccer-mad kingdom into a frenzy of envy of those who were able to watch the tournament on television. The small screen had always been prohibited in Bhutan, although the kingdom was crisscrossed by satellite signals that it was finding increasingly difficult to keep out. Even the king was rumored to have a satellite TV at his palace. Faced with recriminations, the government relented and Bhutan's Olympic Committee was permitted to erect a giant screen in a stadium - but only temporarily.
A TV screen in the middle of Thimphu was a revolutionary sight. The current Dragon King's father initiated a careful program of modernization that saw his people embrace the kind of material progress that most western countries take centuries to achieve: education, modern medicine, transportation, currency, electricity. However, mindful of those afraid that foreign influences could destroy Bhutanese culture, he attempted to inhibit conspicuous consumption. No Coca-Cola. And definitely no television.
By the time of the 1998 World Cup, Bhutan had a new Dragon King and, under growing pressure from an unsettled country, he had a new political agenda. That year, King Jigme Singye Wangchuck announced he would give up his role as head of government and cede power to the national assembly. The people would be consulted about the drafting of a constitution. The process would complete Bhutan's transformation from monarchist Shangri-la into a modern democracy. And television would play its part.
The prime minister of Bhutan, Kinzang Dorji, wants the Bhutanese people to run their own country. "Many are frightened of the responsibility," he said. "A lot of things have changed very quickly in Bhutan, and we do recognise that some people feel lost, at sea. Watching news on the BBC and CNN enables them to see how democracies work in other parts of the world, how people can take charge of their own destinies. The old feudal ways have to end."
The year after France beat Brazil 3-0 in the World Cup final, the people of Thimphu gathered once again in Changlimithang stadium, this time to celebrate the Dragon King's silver jubilee. On June 2, 1999, he stood before them to announce that now they could watch TV whenever they wanted. "But not everything you will see will be good," he warned. "It is my sincere hope that the introduction of television will be beneficial to our people and country."
Inside the headquarters of Sigma Cable, the walls are papered with an X-Files calendar and posters for an HBO show called Hollywood Beauties. Beneath a portrait of the Dragon King, the in-store TV shows wrestling before BeastMaster comes on. A man in tigerskin trunks has trained his marmosets to infiltrate the palace of a barbarian king. When the monarch is decapitated, the children watching outside screech with glee. Inside the office, the staff are fighting for the remote control, channel-hopping. President Bush in a 10-gallon hat welcomes Jiang Zemin to Texas. Midgets wrestle on Star World. Female skaters catfight on Rollerball.
Today, Sigma Cable, whose feed comes from five large satellite dishes at the edge of the city, is the most successful of more than 30 cable operators. Together, they supply virtually the entire country, ensuring that even the folks in remote Trashigang can sit down every night to watch Larry King Live. (info from Guardian News and Media Limited. Photo from Himalayan Tours)
Thursday, February 7, 2008
1991: last American TV maker
The Zenith company began in Chicago, Illinois, in 1918 as a small producer of amateur radio equipment. The name "Zenith" came from its founders' call sign, 9ZN, and Zenith Radio Company was formally incorporated in 1923.
Zenith introduced the first portable radio not long after this, and would eventually go on to invent such things as the wireless remote control, FM multiplex stereo, high-contrast and flat-face picture tubes, and the MTS stereo system used on analog television broadcasts in the US and Canada. Zenith was one of the first companies with a digital high definition TV system.
In the 1980s, Zenith fell on hard times as more and more of the TV business went to Japanese companies with lower prices. In 1979, it had entered the computer business with the purchase of Heath Company and its H-8 computer kit. Zenith renamed Heath's computer division Zenith Data Systems, and eventually sold ZDS and Heath to Groupe Bull in 1989 to raise money for its high-definition TV research efforts. Zenith had changed its name to Zenith Electronics Corporation in 1984, to reflect its interests in computers and cable TV, and because it had left the radio business two years earlier.
By 1990, Zenith was in trouble, and looking more and more vulnerable to a hostile takeover. To avoid this, Zenith sold 5% of itself to LG Electronics as part of a technology-sharing agreement. In 1991, it moved its last American production facility to Mexico. With its analog line aging (the last major update had been in 1978), and the adoption of HDTV in the US years away, Zenith's prospects were dim.
Eventually, LG would raise its stake in Zenith to 55%, enough to assume a controlling interest. Zenith eventually filed for bankruptcy in 1999, and in exchange for its debts, LG offered to buy the part of Zenith it didn't already own. Today LG uses the Zenith brand on a limited number of TV models, and seems to periodically kill and revive the brand. (info from Wikipedia and The New York Times)
Wednesday, February 6, 2008
1892: first voting machine
The paper ballot system, with standardized voting forms, was first adopted in the Australian state of Victoria in 1856, and in the remaining Australian states over the next several years. The paper ballot became known as the "Australian ballot," and New York was the first American state to use it, in 1889.
The first official use of a lever-type voting machine, known then as the "Myers Automatic Booth," was in Lockport, NY in 1892. Four years later, the machines were employed on a large scale in Rochester, NY, and soon were adopted statewide. By 1930, lever machines had been installed in virtually every major city in the US, and by the 1960s well over half of US votes were being cast on these machines.
On mechanical lever voting machines, the name of each candidate or ballot issue choice is assigned a particular lever in a rectangular array of levers on the front of the machine. A set of printed strips visible to the voters identifies the lever assignment for each candidate and issue choice. The levers are horizontal in their unvoted positions.
The voter activates the machine with a lever that also closes a privacy curtain. The voter pulls down selected levers to indicate choices. When the voter exits the booth by opening the privacy curtain with the handle, the voted levers are automatically returned to their original horizontal position. As each lever returns, it causes a connected counter wheel within the machine to turn one-tenth of a full rotation. The counter wheel, serving as the "ones" position of the numerical count for the associated lever, drives a "tens" counter one-tenth of a rotation for each of its full rotations. The "tens" counter similarly drives a "hundreds" counter.
If all mechanical connections are fully operational during the voting period, and the counters are initially set to zero, the position of each counter at the close of the polls indicates the number of votes cast on the lever that drives it. Interlocks in the machine prevent the voter from voting for more choices than permitted.
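The cascading action of the counter wheels works like an odometer. The short Python sketch below is purely illustrative; the class and method names are invented for this example and do not come from the source. It models the ones/tens/hundreds wheels and how each returning lever advances them:

class LeverCounter:
    # Models the counter attached to one lever: three wheels, each holding a digit 0-9.
    def __init__(self):
        self.ones = 0
        self.tens = 0
        self.hundreds = 0

    def register_vote(self):
        # A returning lever turns the ones wheel one-tenth of a rotation.
        self.ones += 1
        if self.ones == 10:        # a full rotation of the ones wheel...
            self.ones = 0
            self.tens += 1         # ...drives the tens wheel one-tenth of a rotation
            if self.tens == 10:    # and a full tens rotation drives the hundreds wheel
                self.tens = 0
                self.hundreds += 1

    def total(self):
        # Reading the wheel positions at the close of the polls gives the vote count.
        return 100 * self.hundreds + 10 * self.tens + self.ones

# Example: 134 voters pull this lever during the day.
counter = LeverCounter()
for _ in range(134):
    counter.register_vote()
print(counter.total())  # prints 134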
Because these machines are no longer made, the trend is to replace them with computer-based or direct recording electronic systems. (info from About.com, photo from the National Museum of American History of the Smithsonian Institution)
Tuesday, February 5, 2008
1968: first 911 call
9-1-1 or 911 (usually pronounced "nine-one-one") is the emergency telephone number for the North American Numbering Plan (NANP).
Before the dial telephone came into widespread usage, a telephone caller simply picked up the phone receiver or handset and waited for the operator to answer "number please?"
The caller then said "connect me to the police," "I want to report a fire," or "I need an ambulance." It was usually not necessary to ask for any of these services by number, even in a large city. Furthermore, the operator instantly knew the calling party's number, even if the caller couldn't stay on the line, simply by looking at the number above that party's line jack.
In small towns, telephone operators frequently went the extra mile by making sure they knew the locations of local doctors, vets, law enforcement personnel, and even private citizens who were willing or able to help in an emergency. Frequently, the operator also activated the town's fire alarm.
When cities and towns began to convert to dial, or "automatic," telephone service, many people were concerned about the loss of the personalized service that had been provided by local operators. This problem was partially solved by telling people to dial "0" for the local assistance operator if they did not know the Fire or Police Department's full number.
Generations of school children were taught to "dial 0 in case of emergency," and this situation remained in place in some areas into the early 1980s. Now, children are taught to call 911.
The push for the development of a nationwide emergency telephone number came in 1957, when the National Association of Fire Chiefs recommended a single number to be used for reporting fires. In 1967 the President's Commission on Law Enforcement and Administration of Justice recommended the creation of a single number that could be used nationwide for reporting emergencies. The burden then fell on the Federal Communications Commission, which met with AT&T in November 1967 to come up with a solution.
In 1968, a solution was agreed upon. AT&T had chosen the number 911, which met the requirements that it be brief, easy to remember, easy to dial, and compatible with the phone systems in place at the time. How the number 911 itself was chosen is not well known and is subject to much speculation. However, many assert that the number 911 was chosen to be similar to the numbers 2-1-1 (long distance), 4-1-1 (information, later called "directory assistance"), and 6-1-1 (repair service), which had already been in use by AT&T since 1966. Also, it was necessary to ensure that the 9-1-1 number was not dialed accidentally, so 9-1-1 made sense because the numbers "9" and "1" were on opposite ends of a dial.
Furthermore, the North American Numbering Plan in use at the time established rules for which numbers could be used for area codes and exchanges. At the time, the middle digit of an area code had to be either a 0 or 1, and the first two digits of an exchange could not be a 1. At the telephone switching station, the second dialed digit was used to determine whether the number was long distance or local: if the second digit was a 0 or 1, the call was long distance; otherwise it was local. Thus, because the switching equipment detected 911 as a special number, it could be routed appropriately. Also, 911 was unique, never having been used as an area code or service code.
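To make that second-digit rule concrete, here is a minimal Python sketch. It is an illustration only; the function name and the simplified rules are assumptions for this example, not a description of actual switching equipment. It shows how a switch could tell a local call, a long-distance call and the 9-1-1 service code apart from the digits dialed:

def classify_dialed(digits):
    # Simplified model of the 1960s routing rules described above.
    if digits == "911":
        return "emergency"        # unique service code, routed specially
    if len(digits) >= 2 and digits[1] in "01":
        return "long distance"    # a 0 or 1 as the second digit meant an area code followed
    return "local"                # any other second digit meant a local exchange

# Hypothetical dialed strings:
print(classify_dialed("911"))         # emergency
print(classify_dialed("2125551234"))  # long distance (area code 212)
print(classify_dialed("5551234"))     # local (exchange 555)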
AT&T announced the selection of 9-1-1 as their choice of the three-digit emergency number at a press conference in Washington, DC.
In Alabama, Bob Gallagher, president of the independent Alabama Telephone Co., read an article in the Wall Street Journal that reported the AT&T 911 announcement. Gallagher's competitive spirit motivated him to beat AT&T to the punch by being the first to implement the 911 service somewhere within the Alabama Telephone Co. territory.
He contacted Robert Fitzgerald, who was Inside State Plant Manager for ATC and who recommended Haleyville, Alabama as the prime site. Gallagher later issued a press release announcing that the 911 service would begin in Haleyville on Feb. 16, 1968. Fitzgerald designed the circuitry and, with the assistance of technicians Jimmy White, Glenn Johnston, Al Bush and Pete Gosa, quickly completed the central office work and installation.
Just 35 days after AT&T's announcement, on February 16, 1968, the first-ever 9-1-1 call was placed by Alabama Speaker of the House Rankin Fite from Haleyville City Hall to US Rep. Tom Bevill at the city's police station. Bevill reportedly answered the phone with "Hello." Attending with Fite was Haleyville mayor James Whitt. At the police station with Bevill were Gallagher and Alabama Public Service Commission director Eugene "Bull" Connor (formerly the Birmingham public safety commissioner known for resisting federal desegregation efforts). Fitzgerald was at the ATC central office serving Haleyville, and actually observed the call pass through the switching gear, as the mechanical equipment clunked out "9-1-1."
In 1973, the White House urged nationwide adoption of 911, but widespread adoption was slowed by the public's traditional reliance upon the live human operators whom AT&T continued to make available at no cost to anyone who dialed "0" from any telephone.
It wasn't until AT&T was broken up into seven regional operating companies by antitrust action on January 1, 1984, that no-cost access to human operators began to disappear, and cities and counties began to perceive a real need to spend the money to create 911 call centers to support the emergency number.
"9-1-1 Emergency Telephone Number Day" was proclaimed, by President Ronald Reagan in 1987 to encourage adoption across the country. In 1999, President Bill Clinton signed the bill that designated 911 as the nationwide emergency number. Even though 9-1-1 was first introduced in 1968, as of 2008 the network still does not completely cover some rural areas of the United States and Canada. (info from Wikipedia)
Before the dial telephone came into widespread usage, a telephone caller simply picked up the phone receiver or handset and waited for the operator to answer "number please?"
The caller then said "connect me to the police," "I want to report a fire," or "I need an ambulance." It was usually not necessary to ask for any of these services by number, even in a large city. Furthermore, the operator instantly knew the calling party's number even if he couldn't stay on the line by simply looking at the number above the line jack of the calling party.
In small towns, telephone operators frequently went the extra mile by making sure they knew the locations of local doctors, vets, law enforcement personnel, and even private citizens who were willing or able to help in an emergency. Frequently, the operator also activated the town's fire alarm.
When cities and towns began to convert to dial or, "automatic" telephone service, many people were concerned about the loss of the personalized service that had been provided by local operators. This problem was partially solved by telling people to dial "0" for the local assistance operator if they did not know the Fire or Police Department's full number.
Generations of school children were taught to "dial 0 in case of emergency," and this situation remained in place in some areas into the early 1980s. Now, children are taught to call 911.
The push for the development of a nationwide emergency telephone number came in 1957 when the National Association of Fire Chiefs recommended a single number to be used for reporting fires. In 1967 the President's Commission on Law Enforcement and Administration of Justice recommended the creation of a single number that can be used nationwide for reporting emergencies. The burden then fell on the Federal Communications Commission, which then met with AT&T in November 1967 in order to come up with a solution.
In 1968, a solution was agreed upon. AT&T chose the number 911, which met the requirements that it be brief, easy to remember, easy to dial, and compatible with the phone systems in place at the time. Exactly how 911 itself was chosen is not well documented and is subject to much speculation. However, many assert that 911 was chosen to be similar to 2-1-1 (long distance), 4-1-1 (information, later called "directory assistance"), and 6-1-1 (repair service), which AT&T had already been using since 1966. It was also necessary to ensure that 9-1-1 could not easily be dialed by accident, and the number made sense because "9" and "1" sit at opposite ends of a rotary dial.
Furthermore, the North American Numbering Plan in use at the time established rules for which numbers could be used for area codes and exchanges. At the time, the middle digit of an area code had to be either a 0 or a 1, and the first two digits of an exchange could not be a 1. At the telephone switching station, the second dialed digit was used to determine whether the number was long distance or local: if the second digit was a 0 or a 1, the call was long distance; otherwise it was local. Thus, since 911 was detected by the switching equipment as a special number, it could be routed appropriately. Also, 911 was a unique number, never having been used as an area code or service code, so it could be set aside for emergency use without conflicting with any existing number.
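To illustrate the second-digit rule described above, here is a minimal sketch in Python (purely hypothetical; real switches of the era implemented this logic in electromechanical hardware, not software) of how a dialed number could be classified, with 9-1-1 treated as a special case:

    def classify_dialed_number(digits):
        # Toy classifier mirroring the 1960s rule described above.
        # Illustrative sketch only, not actual switching logic.
        if digits == "911":
            # With a 1 in the second position, 9-1-1 looked "special" to the
            # switch and could be intercepted and routed to emergency equipment.
            return "emergency"
        if len(digits) >= 2 and digits[1] in ("0", "1"):
            # A 0 or 1 as the second dialed digit signaled a long-distance call.
            return "long distance"
        return "local"

    # Examples:
    # classify_dialed_number("911")        -> "emergency"
    # classify_dialed_number("2125551234") -> "long distance" (second digit is 1)
    # classify_dialed_number("5551234")    -> "local"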
AT&T announced the selection of 9-1-1 as their choice of the three-digit emergency number at a press conference in Washington, DC.
In Alabama, Bob Gallagher, president of the independent Alabama Telephone Co. read an article in the Wall Street Journal, which reported the AT&T 911 announcement. Gallagher’s competitive spirit motivated him to beat AT&T to the punch by being the first to implement the 911 service, somewhere within the Alabama Telephone Co. territory.
He contacted Robert Fitzgerald, who was Inside State Plant Manager for ATC, who in turn recommended Haleyville, Alabama as the prime site. Gallagher later issued a press release announcing that the 911 service would begin in Haleyville on Feb. 16, 1968. Fitzgerald designed the circuitry and with the assistance of technicians Jimmy White, Glenn Johnston, Al Bush and Pete Gosa, they quickly completed the central office work and installation.
Just 35 days after AT&T's announcement, on February 16, 1968, the first-ever 9-1-1 call was placed by Alabama Speaker of the House Rankin Fite from Haleyville City Hall to US Rep. Tom Bevill at the city's police station. Bevill reportedly answered the phone with "Hello." Attending with Fite was Haleyville mayor James Whitt. At the police station with Bevill was Gallagher and Alabama Public Service Commission director Eugene "Bull" Connor (formerly the Birmingham police chief involved in federal desegregation). Fitzgerald was at the ATC central office serving Haleyville, and actually observed the call pass through the switching gear, as the mechanical equipment clunked out "9-1-1."
In 1973, the White House urged nationwide adoption of 911, but widespread adoption was slowed by the public's traditional reliance on the live human operators whom AT&T continued to make available at no cost to anyone who dialed "0" from any telephone.
It wasn't until AT&T was broken up into seven regional operating companies by anti-trust action on January 1, 1984, that free access to human operators began to disappear, and cities and counties began to see a real need to spend the money to create 911 call centers to support the emergency number.
"9-1-1 Emergency Telephone Number Day" was proclaimed, by President Ronald Reagan in 1987 to encourage adoption across the country. In 1999, President Bill Clinton signed the bill that designated 911 as the nationwide emergency number. Even though 9-1-1 was first introduced in 1968, as of 2008 the network still does not completely cover some rural areas of the United States and Canada. (info from Wikipedia)
Monday, February 4, 2008
1959: the night the music died
On February 3, 1959, a small-plane crash near Clear Lake, Iowa, killed three popular American rock and roll musicians: Buddy Holly, Ritchie Valens, and J.P. "The Big Bopper" Richardson, as well as the pilot, Roger Peterson. The day was later called "The Day the Music Died" by Don McLean in his 1971 tribute song about the crash, "American Pie."
"The Winter Dance Party" was a tour that was set to cover 24 Midwestern cities in three weeks. A logistical problem with the tour was the amount of travel, as the distance between venues was not a priority when scheduling each performance. For example, the tour would start at venue A, travel two hundred miles to venue B, and travel back one hundred seventy miles to venue C, which was only thirty miles from venue A. Adding to the disarray, the tour bus used to carry the musicians was ill-prepared for the weather; its heating system broke shortly after the tour began. Drummer Carl Bunch developed a severe case of frostbitten feet while on the bus and was taken to a local hospital. As he recovered, Buddy Holly and Ritchie Valens took turns with the drums.
The Surf Ballroom in Clear Lake, Iowa was never intended to be a stop on the tour, but promoters, hoping to fill an open date, called the manager of the ballroom at the time and offered him the show. He accepted and the date of the show was set for February 2.
When Buddy Holly arrived at the ballroom that evening, he had had enough of the tour bus, and he suggested to his bandmates that, once the show was over, they charter a plane to get to the next stop on the tour, an armory in Moorhead, Minnesota. The destination of the flight was Hector Airport in Fargo, North Dakota (directly across the Red River from Moorhead), as Moorhead did not have an airport. According to VH-1's Behind the Music: The Day the Music Died, Holly was also upset that he had run out of clean undershirts, socks, and underwear; he said he needed to do some laundry before the next performance, and the local laundromat in Clear Lake was closed for repairs.
Flight arrangements were made with Roger Peterson, 21, a local pilot who worked for Dwyer Flying Service in Mason City, Iowa. A fee of $36 per person was charged for the single engine Beechcraft Bonanza, which could seat three passengers in addition to the pilot.
Richardson had developed a case of the flu during the tour (erroneously thought to have been caused by riding on the unheated bus) and asked one of Holly's bandmates, Waylon Jennings, for his seat on the plane; Jennings agreed to give up the seat. According to an account by Jennings years later, when Holly heard about this, his reply to Jennings was, "Well, I hope your ole bus freezes up!" to which Jennings replied, "Well, I hope your damn plane crashes!" This exchange of words, though made in jest at the time, haunted Jennings for many years afterward.
Ritchie Valens had never flown in a small plane before, and asked Holly's remaining bandmate on the plane, Tommy Allsup, for the seat. Tommy said "I'll flip ya for the remaining seat." Contrary to what is seen in biographical movies, that coin toss did not happen at the airport shortly before takeoff, nor did Buddy Holly toss it. The toss happened at the ballroom shortly before departure to the airport, and the coin was tossed by a DJ who was working the concert that night. Valens won a seat on the plane.
Dion DiMucci of Dion & The Belmonts, who was the fourth headliner on the tour, was approached to join the flight as well; however, the price of $36 was too much. Dion had heard his parents argue for years over the $36 rent for their apartment and could not bring himself to pay an entire month's rent for a short plane ride.
At approximately 1:00 AM Central Time on February 3, the plane took off from Mason City Municipal Airport. Around 1:05, Jerry Dwyer, owner of Dwyer Flying Service, watched the lights of the plane begin to descend toward the ground. At the time, he thought it was an optical illusion caused by the curvature of the Earth and the horizon.
The pilot, Roger Peterson, was expected to file his flight plan once the plane was airborne, but Peterson never called the tower. Repeated attempts by Dwyer to contact his pilot failed. By 3:30 AM, when the airport at Fargo had not heard from Peterson, Dwyer contacted authorities and reported the aircraft missing.
Around 9:15 in the morning, Dwyer took off in another small plane to fly Peterson's intended route. A short time later he spotted the wreckage in a cornfield about five miles northwest of the airport. The manager of the Surf Ballroom (who drove the performers to the airport, and also witnessed the plane taking off) made the positive identification of the performers.
The Bonanza was at a slight downward angle and banked to the right when it struck the ground at around 170 mph. The plane tumbled and skidded another 570 feet across the frozen landscape before coming to rest, a crumpled ball of wreckage, against a wire fence at the edge of the property. The bodies of Holly and Valens lay near the plane, Richardson was thrown into a neighboring cornfield, and Peterson remained trapped inside. All four had died instantly from "gross trauma" to the brain, the county coroner declared.
Investigators came to the conclusion that the crash was due to a combination of poor weather conditions and pilot error. Peterson had done poorly on previous flight instrumentation tests and had not been rated for night-time flight, when he would have to rely on his instruments rather than his own vision. It was also found that Peterson was not given an accurate advisory of the weather conditions of his route, which, given his known limitations, might have caused him to postpone the flight. (info from Wikipedia)
Friday, February 1, 2008
2009: Isuzu stops selling cars in the US
Isuzu, the Japanese automaker famous for its commercials featuring a less-than-honest salesman, said Wednesday that it would stop selling passenger vehicles in the US, after its sales had declined to almost nothing in recent years.
The move, effective at the end of January, 2009, marks the end of a 27-year run here and makes Isuzu the first Asian car company to abandon the world's largest market since South Korea's financially troubled Daewoo stopped selling here in 2002.
Although Isuzu's US sales were paltry recently -- it accounted for just 7,098 of the 16.1 million new vehicles sold in the country last year -- the company's influence on the industry, as an innovator and promoter of sport utility vehicles such as the Trooper and the Rodeo in the 1980s and '90s, was huge.
The decision to cease distribution was based in part on the fact that the two models Isuzu currently sells here, the Ascender SUV and the i-series pickup (both manufactured by General Motors for Isuzu), will no longer be made, and the company has no plans to design a new vehicle.
Isuzu will instead focus on its commercial trucks, a much larger business in the U.S. and abroad. In the fiscal year that ended last March, Isuzu earned $910 million on $14.2 billion in worldwide sales. The company said that quitting the US passenger vehicle market would cost about $37 million over two years, including fees to franchised dealers.
One individual who wasn't surprised to hear the news was Joe Isuzu himself. David Leisure, who portrayed the salesman in a series of Isuzu ads in the late 1980s and early '90s, said he hadn't seen an Isuzu on the road for years. "I thought they already had stopped selling here."
Leisure, an unknown actor before landing the role, gained fame for smarmy lines such as "If I'm lying, may lightning hit my mother," in goofy commercials that promised vehicles that could drive faster than a speeding bullet. The commercials, he said, landed him acting jobs on several comedy series, and in 2001, he made a brief return to Isuzu ads, where he pitched the Axiom. "Isuzu gave me an actual career," said Leisure, who has owned two Troopers over the years.
Joe Isuzu, meanwhile, made Isuzu a very popular brand. After building pickups for GM in the 1970s, Isuzu began selling its own vehicles in the US in 1981, with low-cost vehicles such as the Pup pickup and the Trooper. Aside from the Jeep Cherokee, the Trooper was the only four-door SUV available for years, and it helped build a market for what would become a hugely successful category.
Isuzu's US sales peaked at 127,630 vehicles in 1986. But as other carmakers entered the SUV market, Isuzu lost its position as an SUV innovator, instead focusing on producing sedans, favoring joint-venture deals with Subaru and other carmakers, and selling Rodeo SUVs to Honda, which re-badged them as Passports.
By 1997, sales were down to 91,483 units, and introductions of new models such as the Axiom and Ascender did little to halt the slide. Isuzu's 2007 volume was the smallest of any Asian carmaker in the US. (info from the Los Angeles Times)