Catherine N. Pollard was the first woman to serve as a scoutmaster for the Boy Scouts of America, heading a troop in Milford, CT from 1973 to 1975 because no men had volunteered. A widow who lived with her son and daughter, she described herself as a "tomboy who enjoys camping, chops her own wood, drives a tractor and rides a motorcycle."
During this time she was not registered as the troop's Scoutmaster. In both 1974 and 1976, she applied to register as Scoutmaster, but each time her application was rejected. The BSA stated that boys needed male role models, which it claimed she could not provide. She finally filed a sex-discrimination complaint with the Connecticut Commission on Human Rights and Opportunities.
When she filed the complaint, she was forced to relinquish her leadership role in Troop 13. Once again, no man stepped forward to take over as Scoutmaster. As a result, the troop was forced to disband. Catherine rejected BSA's role-model argument. "Who models the male in the first place?" she asked. "The mothers do. The fathers are out working."
The state Commission on Human Rights and Opportunities sided with her, but state courts reversed the ruling. In 1987 the state Supreme Court upheld a lower-court ruling that boys “in the difficult process of maturing to adulthood” needed the guidance of men.
In February 1988, however, the Boy Scouts of America did away with all gender restrictions on volunteer positions, and Ms. Pollard, who was 69 at the time, became a scoutmaster in Milford.
“I do think that this is marvelous,” she said at the time, “because there have been women all over the United States, in fact all over the world, that have been doing these things for the Boy Scouts because they could not get a male leader, but we could not get recognition.” (info from The New York Times and bsa-discrimination.org)
Thursday, January 31, 2008
Wednesday, January 30, 2008
1979: Ireland ends ban on contraception
Until 1979, the law in Catholic-dominated Ireland prohibited the importation and sale of contraceptives, even though most adults regarded birth control as a basic human right. The ban was a prime reason for the large size of Irish families, which was burdensome in poor economic times and spurred emigration to the US.
Before the availability of “the pill” and the condom, the most common form of contraception in Ireland was coitus interruptus. Many Irish women were shocked to find themselves pregnant even though “he pulled out in time.” Men who could not get condoms were known to fashion their own from Saran Wrap. (info from International Encyclopedia)
Tuesday, January 29, 2008
1790: first State of the Union Address
George Washington gave the first State of the Union address on January 8, 1790, in New York City, then the provisional US capital.
In 1801, Thomas Jefferson discontinued the practice of delivering the address in person, regarding it as too monarchical (similar to the Speech from the Throne). Instead, the address was written and then sent to Congress to be read by a clerk until 1913 when Woodrow Wilson re-established the practice.
However, some presidents in the second half of the 20th Century did send written addresses. The last President to do so was Jimmy Carter in 1980.
For many years, the speech was referred to as "the President's Annual Message to Congress." The actual term "State of the Union" did not become widely used until after 1935 when Franklin D. Roosevelt began using the phrase.
Prior to 1934, the annual message was delivered at the end of the calendar year, in December. The Twentieth Amendment, ratified on January 23, 1933, moved the opening of Congress from early March to early January, shifting the delivery of the annual message. Since 1934, the address has been delivered to Congress in January or February, typically on the last Tuesday in January, although no law fixes the date and it varies from year to year.
The Twentieth Amendment also established January 20 as the beginning of the presidential term. In years when a new president is inaugurated, the outgoing president may deliver a final State of the Union message, but none has done so since Jimmy Carter sent a written message in 1981. In 1953 and 1961, Congress received both a written State of the Union message from the outgoing president and a separate State of the Union speech by the incoming president.
Since 1989, in recognition that the responsibility of reporting the State of the Union formally belongs to the president who held office during the past year, newly inaugurated Presidents have not officially called their first speech before Congress a "State of the Union" message.
Calvin Coolidge's 1923 speech was the first to be broadcast on radio. Harry S. Truman's 1947 address was the first to be broadcast on television. Lyndon Johnson's address in 1965 was the first delivered in the evening. Bill Clinton gave his 1999 address while his impeachment trial was underway, and his 1997 address was the first broadcast available live on the World Wide Web. Ronald Reagan was the only president to have postponed his State of the Union address. On January 28, 1986, he planned to give his address, but after learning of the Space Shuttle Challenger disaster, he postponed it for a week and addressed the nation on the day's events. (info from Wikipedia)
Monday, January 28, 2008
1955: first black man sings at the Met
In 1953, baritone Robert McFerrin Sr. won the Metropolitan Opera national auditions. His 1955 debut with the company as Amonasro in “Aida” made him the first black man to perform with the Met, and he sang in 10 operas over three seasons. He was the father of conductor-vocalist Bobby McFerrin.
He appeared just three weeks after contralto Marian Anderson made her historic debut as the first black singer to perform a principal role at the Met.
McFerrin provided the vocals for Sidney Poitier in the 1959 movie “Porgy and Bess.” He also sang with both of his children, Bobby and Brenda McFerrin.
In 1993, father and son appeared with the St. Louis Symphony — the older man as soloist, the younger as guest conductor. “His work influenced everything I do musically,” Bobby McFerrin told The Associated Press in 2003.
Robert McFerrin was born in Marianna, Ark., one of eight children of a strict Baptist minister who forbade his son to sing anything but gospel music. That changed when he moved to St. Louis in 1936 and a music teacher discovered and encouraged his talent. In the late 1940s and early ’50s, McFerrin sang on Broadway and performed with the National Negro Opera Company and the New York City Opera.
He moved back to St. Louis in 1973. He suffered a stroke in 1989, but his singing voice remained. In June 2003 he was honored by Opera America, the national services organization. He died in 2006. (info and photo from The New York Times)
Friday, January 25, 2008
1789: first presidential election in the US
The US presidential election of 1789 was the first held in the manner described by the newly established Constitution.
Before this time, the US had no Presidential office but instead gave limited president-like powers to the Chairman of Congress, a position like the current Speaker of the House or President of the Senate.
George Washington faced little opposition for election as President. Under the system then in place, each elector cast two votes, and the recipient of the greatest number was elected President, provided that number equaled or exceeded half the total number of electors.
The runner-up became Vice President. The Twelfth Amendment to the United States Constitution had not yet been passed, so the electoral system of that era differs from modern elections. Washington was enormously popular, having presided over the Philadelphia Convention that replaced the weak Articles of Confederation with a stronger national government under the new US Constitution.
The recipient of 34 electoral votes, John Adams of Massachusetts, finished second in voting and was elected Vice President.
In the absence of conventions, there was no formal nomination process. The framers of the Constitution had presumed that Washington would be the first President, and once he agreed to come out of retirement to accept the office, there was no opposition to him. Individual states chose their electors, who voted unanimously for Washington when they met.
Electors scattered their second votes widely, many voting for someone other than Adams less out of opposition to him than to ensure he could not match Washington's total.
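The two-vote rule described above can be sketched in a few lines of Python. The ballot data here is hypothetical, chosen only to illustrate the mechanics, not the actual 1789 returns.

```python
from collections import Counter

def tally_two_vote(ballots):
    """Pre-Twelfth-Amendment rule: each elector casts two votes.
    The top vote-getter becomes President if his total equals or
    exceeds half the number of electors; the runner-up becomes
    Vice President."""
    counts = Counter(vote for ballot in ballots for vote in ballot)
    ranked = counts.most_common()
    president, top = ranked[0]
    if top < len(ballots) / 2:
        return None  # no winner; the House of Representatives would decide
    return president, ranked[1][0]

# Hypothetical five-elector example: Washington on every ballot,
# second votes scattered so no one can match his total.
ballots = [
    ("Washington", "Adams"),
    ("Washington", "Adams"),
    ("Washington", "Adams"),
    ("Washington", "Jay"),
    ("Washington", "Clinton"),
]
print(tally_two_vote(ballots))  # → ('Washington', 'Adams')
```

Note how the scattering strategy works: Adams still finishes second and takes the vice presidency, but no amount of second-vote support for him can overtake a candidate who appears on every ballot.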
Only ten states out of the original thirteen cast electoral votes in this election. North Carolina and Rhode Island were ineligible to participate as they had not yet ratified the United States Constitution. New York failed to appoint its allotment of eight electors because of a deadlock in the state legislature.
The total popular vote was 38,818. (info from Wikipedia)
Thursday, January 24, 2008
2003: first US soldier killed in Iraq war
The first US serviceman killed in combat in Iraq was not a citizen of the country for which he sacrificed his life.
Lance Cpl. Jose Gutierrez, 22, a rifleman with the Marines, died in a firefight March 21, 2003 near Umm Qasr. Born in Guatemala, Gutierrez held permanent US resident status, which he obtained in 1999.
At 14, with his parents dead, Gutierrez followed the path of 700,000 of his countrymen to California. He made the 2,000-mile journey on 14 freight trains to get through Mexico. He had no entry papers and US border authorities detained him.
Fernando Castillo, Guatemala’s consul general in Los Angeles, says the United States doesn’t deport Guatemalan minors who arrive without family. Gutierrez was made a ward of Los Angeles Juvenile Court. He was placed in a series of group homes and foster families. He learned English and finished high school.
When he reached 18, he got residency documents, Castillo said.
Marcelo Mosquera, a machinist from Ecuador, and his wife, Nora, were the last couple that sheltered the teenager. They cared for two younger foster children, as well, at their home in Lomita. Neighbors said that Gutierrez acted as the big brother.
Gutierrez talked of becoming an architect but put college plans on hold to join the Marine Corps. Jackie Baker, the Mosqueras’ adult daughter, said that Gutierrez “wanted to give the United States what the United States gave to him. He came with nothing. This country gave him everything.”
As of yesterday, 3,930 American servicemen and women, and about 88,000 civilians have died in the Iraq war.
(info from USA Today, Iraq Body Count, Dep't of Defense)
Wednesday, January 23, 2008
1970: first woman enters McSorley's bar
In 1970, Barbara Shaum became the first woman to step into McSorley's Old Ale House, a legendary East Village fixture dating from the 19th century, on the day a new city law forced the Manhattan bar to admit women for the first time.
Renowned as a men-only bastion with sawdust-covered floors and well-worn furnishings, and long a target of those fighting the exclusion of women, the saloon succumbed to an ordinance banning discrimination against women in all public places in the city.
Shaum's leather-goods shop was then two doors down from McSorley's, and she was friendly with its staff even though she was active in the women's movement. The bar's manager invited her to be the first woman through its doors, soon after the law ended the men-only policy. She entered just ahead of another local shopkeeper, Sara Penn.
The forced end of the bar's men-only policy was really "a small thing" for women, Ms. Shaum said. "Equal pay is more important." (info from The New York Times)
Tuesday, January 22, 2008
1984: first Mac computer
The original Apple Macintosh was released 24 years ago this week, on January 24, 1984. It was the first commercially successful personal computer to feature a mouse and a graphical user interface (GUI) rather than a command line interface. It followed the similar, more expensive, unsuccessful Apple Lisa computer.
The idea for a personal computer appropriate for the ordinary consumer dates to the late 1970s and an Apple development team was established in 1979. After the success of the original Macintosh in 1984, the company quickly established market share only to see it dissipate in the 1990s as Microsoft came to monopolize personal computing.
Apple consolidated multiple, consumer-level desktop models into the 1998 iMac, which sold extremely well and saw the brand name revitalized. Current Mac systems are mainly targeted at the home, education, and creative professional markets. They are the upgraded iMac and the entry-level Mac mini desktop models, the workstation-level Mac Pro tower, the MacBook, MacBook Air and MacBook Pro laptops, and the Xserve server.
Production of the Mac is based on a vertical-integration model: Apple controls all aspects of its hardware and creates its own operating system, which is pre-installed on all Macs. Apple exclusively produces Mac hardware, choosing internal components, designs, and prices. Apple does use third-party parts, however; current Macintosh CPUs use Intel's x86 architecture (formerly the AIM alliance's PowerPC and originally Motorola's 68k). Apple also develops the operating system for Macs, currently Mac OS X 10.5 "Leopard".
This is in contrast to most IBM compatible PCs, where multiple vendors create hardware intended to run another company's software. The modern Mac, like other personal computers, is capable of running alternative operating systems such as Linux, FreeBSD, and Microsoft Windows, considered to be the Mac's biggest competitor. (info and photo from Wikipedia)
Monday, January 21, 2008
2008: first online presidential primary
This year, for the first time, Democrats living outside the United States can cast their ballots in a presidential primary over the Internet.
Democrats Abroad, an official branch of the party representing overseas voters, will hold its first global presidential preference primary from Feb. 5 to 12, with expats selecting the candidate of their choice via the Internet as well as by fax, mail and in person at polling places in more than 100 countries.
US citizens wanting to vote online must join Democrats Abroad before Feb. 1 and indicate their preference to vote by Internet instead of in the local primaries wherever they last lived in the United States. They must promise not to vote twice for president, but can still participate in non-presidential local elections.
Members get a personal identification number from Everyone Counts Inc., the San Diego-based company running the online election. They can then use the number to log in and cast their ballots.
Their votes will be represented at the August Democratic National Convention by 22 delegates, who according to party rules get half a vote each for a total of 11. That's more than US territories get, but fewer than the least populous states, Wyoming and Alaska, which get 18 delegate votes each.
Everyone Counts has been building elections software for a decade, running the British Labor Party's online voting since 2000 and other British elections since 2003, chief executive officer Lori Steele said.
Online voting may give absentee voters more assurance that their ballots are being counted, since confirmation is not available in some counties. The Everyone Counts software even lets voters print out a receipt, unlike most electronic voting machines now in use in many states.
Some 6 million Americans living abroad are eligible to vote in US elections, but only a fraction do so. Until recently, the only option was to mail absentee ballot request forms to the last US county of residence, then wait in hopes that shaky mail systems would deliver the ballots in time to vote.
The system is so unreliable that of 992,034 ballots requested from overseas for the 2006 general election, only 330,000 were cast or counted, and 70 percent of those not counted were returned to elections officials as undeliverable, the US Election Assistance Commission found.
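A quick back-of-the-envelope check, using only the commission's figures quoted above, shows how leaky the overseas pipeline was:

```python
requested = 992_034   # overseas ballots requested, 2006 general election
counted = 330_000     # ballots actually cast or counted
not_counted = requested - counted
undeliverable = round(0.70 * not_counted)  # 70% of the shortfall

print(f"counted: {counted / requested:.0%}")         # → counted: 33%
print(f"returned undeliverable: {undeliverable:,}")  # → returned undeliverable: 463,424
```

In other words, roughly two out of every three requested overseas ballots never made it into the count, most of them lost in transit rather than rejected.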
Republicans Abroad has operated independently of the Republican Party since 2003, and therefore can't hold in-person or Internet votes abroad. But it is organizing to get more overseas Republicans registered back home before the primaries, Executive Director Cynthia Dillon said.
Republican votes from overseas could be more decisive because even small margins can make a difference in their winner-take-all state primaries. The Democrats divide primary votes proportionally, assigning delegates according to each leading candidate's share.
"In the Republican primary, the overseas vote could actually have a bigger impact: That vote could be the tipping vote, so to speak, that decides an election in a close race," said Steven Hill, an elections expert who directs the New America Foundation's Political Reform Program.
With so many states having moved up their primary dates, overseas voters should hurry up and register no matter how they plan on voting, Hill said. "These compressed timetables really make it difficult." (info from The Associated Press)
Friday, January 18, 2008
1909 or 1969: first man reaches the North Pole
Sir Wally Herbert was the first man to walk across the icebound Arctic Ocean and, some contend, the first to reach the North Pole on foot, a feat long credited to Rear Adm. Robert E. Peary.
“It seemed like conquering a horizontal Everest,” Sir Wally said of the 3,620-mile trek across treacherous ice floes that ended May 30, 1969. He led a four-man team on the 476-day expedition from Alaska to Norway.
On April 4, 1969 — 407 days into the journey — the team stopped at the North Pole, planted a Union Jack and ate beef stew from supplies hauled there by its 40 sled dogs. “It was too cold and too windy to hold any other celebrations,” Sir Wally radioed to London.
Sixty years earlier, on April 6, 1909, Admiral Peary was reported to be the first man to reach the pole on foot. The news went out to the world five months later when Admiral Peary and his team arrived at Indian Harbor, Labrador, and sent a wire to The New York Times, which had exclusive rights to the story. The message read in part, “I have the Pole, April Sixth.”
That claim has been debated. In 1973, Dennis Rawlins, an astronomer, wrote a book, “Peary at the North Pole: Fact or Fiction?” in which he calculated that Admiral Peary had missed the pole by about 60 miles. Mr. Rawlins called the admiral’s claim a “navigational fantasy.”
Then, in 1985, Sir Wally, who wrote nine books on polar exploration, was invited to examine Admiral Peary’s diary and astronomical observations. The documents had not been made public since 1911.
In September 1988, the National Geographic Society, which had sponsored Admiral Peary’s expedition, published an article by Sir Wally in its magazine detailing navigational errors, suspect distance records and inexplicably blank pages in the admiral’s diary. Drawing on new knowledge of Arctic Ocean weather, currents and ice drift, he concluded that those factors and navigational mistakes had left Admiral Peary 30 to 60 miles from the pole.
Sir Wally was particularly concerned that Admiral Peary’s handwritten diary offered no record of his 30 hours near the pole. Several pages were blank, and the entry for April 6 made no mention of the pole. Instead, a loose leaf had been inserted, declaring, “The Pole at last!!!” Whether Peary actually made it to the pole, Sir Wally wrote, “can never be anything more than a probability.”
In 1989, however, the National Geographic Society commissioned the Navigation Foundation, a professional society, to examine the evidence. Based on an analysis of photographs, celestial sightings, ocean depth readings and other records, the foundation, in a 230-page report, concluded that Peary’s final camp had been within five miles of the pole. (info from The New York Times)
Thursday, January 17, 2008
1895: first boob job
Known as breast augmentation, breast enlargement, mammoplasty enlargement, augmentation mammoplasty or boob job, a breast implant is a prosthesis used to enlarge a woman's breasts. It's done for cosmetic reasons, to reconstruct the breast (e.g., after a mastectomy or to correct genetic deformities), or as part of male-to-female sex reassignment surgery.
According to the American Society of Plastic Surgeons, breast augmentation is the most commonly performed cosmetic surgical procedure in the US. In 2006, 329,000 breast augmentation procedures were performed in the US.
There are two primary types of breast implants: saline-filled and silicone-gel-filled implants. Saline implants have a silicone elastomer shell filled with sterile saline liquid. Silicone gel implants have a silicone shell filled with a viscous silicone gel.
The earliest known implant was attempted in 1895 by Austrian-German surgeon Vincenz Czerny, using a woman's own adipose tissue (from a benign growth on her back).
Czech surgeon Robert Gersuny tried paraffin injections in 1889, with disastrous results. Subsequently, in the early to mid-1900s, a number of other substances were tried, including ivory, glass balls, ground rubber, ox cartilage, Terylene wool, gutta-percha, Dicora, polyethylene chips, polyvinyl alcohol-formaldehyde polymer sponge (Ivalon), Ivalon in a polyethylene sac, polyether foam sponge, polyethylene tape or strips wound into a ball, polyurethane foam sponge, Silastic rubber, and Teflon-silicone prostheses.
Various creams and medications have also been used in attempts to increase bust size. In 1945 and 1950, doctors performed flap-based augmentations by rotating the patient's chest wall tissue into the breast to add volume.
Various synthetics were used throughout the 1950s and 1960s, including silicone injections, which an estimated 50,000 women received.
Development of silicone granulomas (inflamed tumors) and hardening of the breasts were in some cases so severe that women needed to have mastectomies for treatment. Women sometimes seek medical treatment for complications up to 30 years after receiving this type of injection.
Wednesday, January 16, 2008
1955: Salk polio vaccine
Jonas Edward Salk (1914-1995) was an American biologist and physician best known for the research and development of the first effective polio vaccine.
In 1947, Salk joined the University of Pittsburgh, as head of the Virus Research lab. Though he continued his research on improving the influenza vaccine, he set his sights on the poliomyelitis (polio) virus.
At that time, it was believed that immunity could come only after the body had survived at least a mild infection by live virus. In contrast, Salk observed that it was possible to acquire immunity through contact with inactivated (killed) virus.
Using formaldehyde, Salk killed the polio virus, but kept it intact enough to trigger the necessary immune response. Salk's research caught the attention of Basil O'Connor, president of the National Foundation for Infantile Paralysis (now known as the March of Dimes Birth Defects Foundation). The organization decided to fund Salk's efforts to develop a killed virus vaccine.
The vaccine was first tested in monkeys, and then in patients at the D.T. Watson Home for Crippled Children. After successful tests, in 1952, Salk tested his vaccine on volunteers, including himself, his laboratory staff, his wife, and his children.
In 1954, national testing began on two million children, ages six to nine, who became known as the Polio Pioneers. This was one of the first double-blind placebo-controlled tests, a design that has since become standard: half of the injected children received the vaccine and half received a placebo, with neither the participants nor the researchers knowing who was in the experimental group and who was in the control group.
A third of the children, living in areas where the vaccine was not available, were observed in order to evaluate the background level of polio in this age group. On April 12, 1955, the results were announced: the vaccine was safe and effective. The patient would develop immunity to the live disease due to the body's earlier reaction to the killed virus.
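The double-blind idea above can be sketched in a few lines of code. This is only an illustration of the design, not the actual 1954 protocol; the function name and vial-code format are my own.

```python
import random

def assign_double_blind(participant_ids, seed=1954):
    """Randomly split participants into vaccine and placebo arms.

    Returns (labels, key): `labels` maps each participant to an opaque
    vial code, and `key` maps each code to its true arm. In a real trial,
    only a third party holds `key` until the study is unblinded, so
    neither subjects nor researchers know who received what.
    """
    rng = random.Random(seed)
    ids = list(participant_ids)
    # Build a half-vaccine, half-placebo pool and shuffle it,
    # so each participant's arm is random.
    arms = ["vaccine"] * (len(ids) // 2) + ["placebo"] * (len(ids) - len(ids) // 2)
    rng.shuffle(arms)
    labels, key = {}, {}
    for i, (pid, arm) in enumerate(zip(ids, arms)):
        code = f"VIAL-{i:06d}"  # the code itself reveals nothing about the arm
        key[code] = arm
        labels[pid] = code
    return labels, key
```

Because the arm assignments are shuffled but the vial codes are not, knowing a code tells an observer nothing about which arm it belongs to until the key is opened.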
Salk's vaccine was instrumental in beginning the eradication of polio, a once widely feared disease. Polio epidemics in 1916 left about 6,000 dead and 27,000 paralyzed in the United States. In 1952, 57,628 cases were recorded in the U.S. After the vaccine became available, polio cases in the US dropped by 85-90 percent in only two years.
However, the live-virus oral vaccine developed by Albert Sabin became the preferred alternative after a sometimes intense clash between the two scientists and their adherents.
The Salk vaccine, which is injected, proved to be effective in sharply reducing the number of polio cases in the United States. One disadvantage to the Salk vaccine was that booster shots had to be taken periodically. But the Sabin vaccine had the advantage of easier delivery and became accepted in the United States after the testing abroad. It was licensed in 1961 and eventually became the vaccine of choice in most parts of the world. The last indigenous case of polio in the US was reported in 1991. Partly because of that fact, only inactivated, Salk-type polio vaccines have been recommended for use in the United States since 2000. (info from Wikipedia)
Tuesday, January 15, 2008
1967: first Super Bowl
In professional American football, the Super Bowl is the championship game of the National Football League (NFL). It and its associated festivities constitute Super Bowl Sunday, which has become the most-watched US television broadcast of the year. Super Bowl Sunday is the second-largest US food consumption day, following Thanksgiving.
Many popular singers and musicians have performed during the Super Bowl's pre-game and halftime ceremonies -- some notoriously, as in 2004, when Janet Jackson's nipple was exposed by Justin Timberlake in what was referred to as a "wardrobe malfunction."
The Super Bowl was first played on January 15, 1967 as part of an agreement between the NFL and its younger rival, the American Football League (AFL) in which each league's championship team would play each other in an "AFL-NFL World Championship Game". After the leagues merged in 1970, the Super Bowl became the NFL's championship game.
The Super Bowl uses Roman numerals to identify each game, rather than the year it was held, because the NFL season extends beyond New Year's Eve. For example, the Indianapolis Colts, winners of Super Bowl XLI, are the champions of the 2006 season, even though the championship game was played in February 2007.
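The arithmetic behind the numbering is simple: Super Bowl I capped the 1966 season, so a season's edition number is the season year minus 1965, rendered as a Roman numeral. A quick sketch (the function name is my own):

```python
def super_bowl_numeral(season_year):
    """Return the Super Bowl numeral for a given NFL season year.

    Super Bowl I followed the 1966 season, so the edition number
    is season_year - 1965, written in Roman numerals.
    """
    n = season_year - 1965
    if n < 1:
        raise ValueError("no Super Bowl before the 1966 season")
    # Standard greedy Roman-numeral conversion.
    pairs = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
             (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
             (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in pairs:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)
```

So the 2006 season yields 2006 - 1965 = 41, or "XLI" -- matching the Colts' title above.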
After its inception in 1920, the NFL fended off several rival leagues before the AFL began play in 1960. The intense competitive war for players and fans led to serious merger talks between the two leagues, culminating in a merger announcement in 1966.
One of the conditions of the merger was that the winners of each league's championship game would meet in a contest to determine the "world champion of football".
According to NFL Films President Steve Sabol, then NFL Commissioner Pete Rozelle wanted to call the game "The Big One". During the discussions to iron out the details, AFL founder and Kansas City Chiefs owner Lamar Hunt had jokingly referred to the proposed interleague championship as the "Super Bowl". Hunt thought of the name after seeing his kids playing with a toy called a Super Ball. The ball is now on display at the Pro Football Hall of Fame in Canton, Ohio.
The name was consistent with postseason college football games which had long been known as "bowl games". The "bowl" term originated from the Rose Bowl Game, which was in turn named for the bowl-shaped stadium in which it is played. Hunt only meant his suggested name to be a stopgap until a better one could be found. Nevertheless, the name "Super Bowl" became permanent.
After the NFL's Green Bay Packers convincingly won the first two Super Bowls, some team owners feared for the future of the merger. At the time, many doubted that AFL teams could compete with their NFL counterparts. That perception changed with one of the biggest upsets in sports history: the AFL's New York Jets' defeat of the NFL's Baltimore Colts in Super Bowl III in Miami. One year later, the AFL's Kansas City Chiefs defeated the NFL's Minnesota Vikings 23-7 to win Super Bowl IV in New Orleans, the last World Championship game played between the champions of the two leagues. The first four Super Bowls were actually AFL-NFL World Championship Games at the time; after the merger, they were redesignated as Super Bowls I through IV.
The game is played annually on a Sunday as the final game of the NFL Playoffs. Originally the game took place in early to mid-January following a 14-game regular season and playoffs. The game now takes place in late January or even the first Sunday in February, due to the current 17-week schedule.
The winning team gets the Vince Lombardi Trophy, named for the coach of the Green Bay Packers, who won the first two Super Bowl games and 3 of the 5 preceding NFL championships.
Super Bowl XLII will be the 42nd annual edition of the Super Bowl. The game is scheduled to be played following the 2007 regular season on February 3, 2008 at University of Phoenix Stadium in Glendale, Arizona. (info from Wikipedia)
Editor's Note: I have never watched a Super Bowl Game. Football Sucks. It is only slightly less sleep-inducing than golf, or watching paint dry.
Monday, January 14, 2008
2008: death of world's greatest eater
Eddie "Bozo" Miller, who was once listed by the Guinness Book of World Records as the "world's greatest trencherman" (glutton), died Jan. 7 at age 89.
He once won a contest by eating 30 pounds of elk and moose meatloaf. He boasted of downing 25 bowls of minestrone and 30 pounds of shrimp, and of drinking a whole bottle of gin in a single chug on a bet, then offering to buy the loser a drink. His Guinness records -- in categories no longer recorded in the book -- were for eating 27 two-pound pullets of chicken and for downing 324 raviolis, each at a single sitting in 1963. Guinness noted that he downed the first 250 raviolis in 70 minutes; there was a delay while the restaurant restocked.
Miller in his prime was a jolly ball of a man, 5 feet 7 inches tall and 57 inches at the waist, making him, wrote one of his many friends in the press, almost as easy to walk over as to walk around. He was born to San Francisco vaudevillians and eventually became one of Oakland's most prominent men about town, driving a bright-yellow Cadillac with boxes of perfume and pearls in the trunk as presents for the ladies.
He ate as many as 11 meals and 25,000 calories a day. Rejecting a few smaller creatures -- rabbit, duck, snails -- he chewed a mighty swath through the East Bay. And he boasted of seldom suffering indigestion, except once, in 1942, when a snack of 10 pounds of cheese crackers made him ill. He said he often sank a dozen martinis before his first lunch of the day.
"His hobby was getting people drunk," says Steve Blackman, his son-in-law. The weekly dinners he held at his home for guests, often including Hollywood royalty, challenged each diner to ingest ludicrous quantities of cuisine cooked by the host. After dinner, his three daughters -- Candy, Cooky, and Honey -- would present a selection of more than 100 liqueurs. On the wall was a framed slogan from his hero, W.C. Fields: "Nothing exceeds like excess." Miller's other tastes tended to the extreme as well. He boasted that he owned 10,000 records.
A competitive spirit drove him, he said. Too often in periods when he tried to diet, someone would challenge his gustatory supremacy and he would suddenly find himself downing 25 7-Ups or three steaks. "A friend will say, 'Hey Boz, eat a jar of white horseradish for my wife,'" he told the Los Angeles Times in 1972. "I can't say no."
Miller liked to imagine that he would have outeaten Diamond Jim Brady (1856-1917), the Gilded Age financier and gourmand. "I understand he was strong, mighty strong in the meat department," Miller told the Fresno Bee in 1944. "But he was vulnerable in pastry. Me, I have no weaknesses."
He likewise regretted not being in his prime to take on Takeru Kobayashi, the terror of the Coney Island boardwalk who has dominated hot-dog-eating contests in recent years. "He said he would have ate him under the table," recalls Blackman.
George Shea, chairman of the International Federation of Competitive Eating, says the organization will have a moment of silence to honor Mr. Miller at its Nathan's Famous Hot Dog Eating Contest in Coney Island this July 4.
After owning and managing bars in the 1940s, Miller became a liquor distributor. By the late 1970s, the advice of his latest doctors (previous ones predeceased him, he declared) and the death of a daughter began to limit his hunger. He slimmed down to about 175 pounds from more than 300 at his peak.
Sometimes suspected of stretching the truth about his achievements as a gurgitator, he habitually exaggerated his age by a decade. Although he wasn't quite into his nineties, as he claimed, it was still no mean feat for him to sit down with an Oakland Tribune reporter last month to take on a French-dip sandwich. Miller complained of having no appetite and quipped of his funeral, "They're going to stuff me." (info from The Wall Street Journal, photo from NBC11.com)
Friday, January 11, 2008
2005: Spain permits gay marriages
In July of 2005, Ramón Vizcaíno and Luis Ibarcena became one of the first gay couples to seek government authorization to wed under Spain's new marriage law.
"This means we are no longer second-class citizens," Vizcaíno said. "We have always had the same obligations as other citizens. We deserve the same rights, too."
The lines inside the Madrid Civil Registry, where residents apply for marriage licenses, swelled with gay and lesbian couples for the first time, after Parliament passed a law giving same-sex couples the right to marry and to adopt children.
The vote made Spain the first nation to remove all legal distinctions between same-sex and heterosexual unions, said advocates for marriage rights for gay couples. Belgium, Canada and the Netherlands had previously legalized gay marriage, but only Canada's laws, which did not yet apply to all of the country, contained language as liberal as Spain's.
Parliament's decision to legalize gay marriage provoked tremendous animosity among religious conservatives in predominantly Roman Catholic Spain.
In a speech before Pope Benedict XVI in Rome, the archbishop of Madrid, Cardinal Antonio María Rouco Varela, condemned the law, saying it was evidence of a society in which "not only is faith denied, but also human reason itself."
Ricardo Blázquez, the president of the Conference of Catholic Bishops, also denounced the law, saying that it "throws moral and human order into confusion."
Many gay couples said they had grown up in Catholic households but were no longer practicing Catholics, in part because of the church's opposition to gay marriage. Some still attended church regularly. (info from The New York Times)
Thursday, January 10, 2008
1984: last state votes for booze in bars
After a quarter century of illegal drinking -- what cynics called "liquor by the wink" -- Oklahoma in 1984 acknowledged what it had been doing all along and voted to allow the legal sale of liquor by the drink.
The third attempt in a dozen years to make it legal to buy a drink in a bar or restaurant was approved by 425,772 to 396,986, or 52 to 48 percent.
In two statewide votes on the same issue, in 1972 and 1976, constitutional amendments to allow sale of liquor by the drink were defeated by substantial margins. A third attempt, in 1980, never made it to the ballot stage because of a legal challenge. A similar effort to block the vote by legal means was made this time, but a court turned it aside.
The vote did not directly permit sale of liquor by the drink, but instead authorized the Legislature to draft enabling legislation to allow each of the state's 77 counties to conduct a separate referendum on the issue.
The vote was on an amendment to replace a provision in the Oklahoma Constitution that banned sale of any alcohol by the drink except for beer with an alcohol content no higher than 3.2 percent.
Even as voters were going to the polls Tuesday, "private clubs" served drinks throughout the day. Under the law, such clubs are allowed to serve patrons from the patron's own bottle, purportedly bought at a state-licensed liquor store. But in practice the customer simply buys his drink and pays for it -- thus "liquor by the wink."
Proponents of legalizing barroom sales used the slogan, "Let's Be Honest," while opponents based their appeal on the dangers alcohol poses to health and society, particularly for drunken driving.
However, that argument appeared to have been blunted by statistics showing the state, under the present system, already ranks sixth in the nation in the number of highway deaths per capita, and 10th in arrests per capita for drunken driving.
Oklahoma was the last state to repeal prohibition, in 1959, and had never in its history as a state allowed open saloons.
The passage of the referendum, despite past defeats, may have been due as much to demographic change as to a change in attitude. Since the last vote in 1976, some 600,000 new voters had been added to the rolls, most of them either young or newcomers from states with more liberal liquor laws.
Also, Oklahoma had undergone a dramatic economic reversal because of the sharp decline in oil and gas revenues. Thus revenue from the legal, taxed sale of liquor by the drink, and from the increased tourism the change was expected to encourage, became increasingly important. (info from The New York Times)
Wednesday, January 9, 2008
1963: first pole vault over 17 feet
In an event that had seen the world record advance less than six inches, to the 16-foot level, over the previous 20 years, and only then after aluminum poles had replaced bamboo ones, John Thomas Pennel was in the vanguard of a group of pioneering athletes who transformed the sport in the early 1960's.
Their weapon was the new fiberglass pole, and their impact on the sport was decisive.
In one five-month span in 1963, Pennel, a senior at Northeast Louisiana State, personally added more than nine inches to the record, beginning with a 16-foot-3-inch vault at the Memphis Relays on March 23 and culminating with his benchmark-shattering vault of 17 feet, 3/4 inches at the Gold Coast meet at the University of Miami on Aug. 24.
His stunning series of achievements in 1963 made Pennel, who won the year's Sullivan Award as the nation's top amateur athlete, a favorite to win the gold medal at the 1964 Olympics in Tokyo, but a back injury six weeks before the Games cost him a chance for Olympic glory. He finished 11th, with a height of 15-5, as his American teammate, Fred Hansen, set an Olympic record with a vault of 16-8 3/4.
Hansen, who advanced the world record to 17-3 3/4 during 1964, and Bob Seagren, who increased it to 17-5 two years later, dominated the sport for a while after that, but it was Pennel who took the record to 17-6, in 1966.
Two years later, with Seagren now his main rival, Pennel was again a favorite to win Olympic gold, but he was knocked out of the competition in Mexico City and relegated to a fifth-place finish when his pole fell under the bar on a vault of 17-8 1/2 that would have clinched a bronze medal and kept him in competition for the gold, which was captured by Seagren.
The next year, Pennel set his eighth and last world record, 17-10 1/4, more than a foot and a half above his original mark. A series of injuries led him to end his career in 1970, the year Christos Papanicolaou of Greece became the first man to clear 18 feet.
Pennel later worked in sports marketing for Adidas and others, and made television commercials. He died in 1993 at age 53, from cancer. (photo from Viewimages, info from The New York Times)
Tuesday, January 8, 2008
2008: 32GB SD memory card
It wasn't too long ago that one gigabyte was a BIG hard drive, and 128 kilobytes was a BIG memory card.
Here at the International Consumer Electronics Show in Las Vegas, Panasonic has upped the SD memory card ante by announcing one with 32GB capacity. It's said to be the world's first 32-gigger to meet the Class 6 speed specification (up to 20MB per second).
Storage capacity is about eight hours of hi-def video, or umpteen gazillion photographs. Price has not yet been announced.
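The "about eight hours" figure is easy to sanity-check. A rough sketch in Python, assuming an AVCHD-class HD bitrate of roughly 9 Mbit/s and the decimal gigabytes card makers count in (neither figure comes from Panasonic's announcement):

```python
# Rough sanity check of the claimed capacity of a 32GB Class 6 SD card.
# Assumptions (not from the announcement): HD video at ~9 Mbit/s,
# and "32 GB" meaning 32 * 10**9 bytes, as card marketing counts it.

CARD_BYTES = 32 * 10**9        # 32 GB, decimal
HD_BITRATE_BPS = 9 * 10**6     # ~9 Mbit/s HD video (assumed)

seconds = CARD_BYTES * 8 / HD_BITRATE_BPS
hours = seconds / 3600
print(f"~{hours:.1f} hours of HD video")   # ~7.9 hours, i.e. "about eight"

# Note: Class 6 only guarantees a 6 MB/s minimum sustained write;
# the quoted 20 MB/s is the card's peak transfer rate.
fill_minutes = CARD_BYTES / (20 * 10**6) / 60
print(f"~{fill_minutes:.0f} minutes to fill at 20 MB/s")
```

At higher bitrates (broadcast HD can run 15-20 Mbit/s) the same card holds proportionally less, which is why such capacity claims are always approximate.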
Monday, January 7, 2008
2007: US music sales drop, even including downloads
US "album" sales plunged 15% last year from 2006, as the recording industry marked another weak year of sales despite a 45% surge in the sale of digital tracks.
A total of 500.5 million albums were purchased as CDs, cassettes, LPs and other formats last year, down from 588 million in 2006, said Nielsen SoundScan, which tracks point-of-purchase sales.
The shortfall in album sales drops to 9.5% when sales of digital singles are counted as 10-track equivalent albums. The number of digital tracks sold jumped 45% to 844.2 million; digital album sales accounted for 10% of total album purchases.
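The two drop figures reconcile with quick arithmetic. A sketch (the 2006 track count is back-solved from the reported 45% growth; it is not stated in the article):

```python
# Reconciling the 15% album-only drop with the 9.5% drop that counts
# digital singles as 10-track-equivalent albums (TEA).
albums_2007 = 500.5e6
albums_2006 = 588.0e6
tracks_2007 = 844.2e6
tracks_2006 = tracks_2007 / 1.45   # tracks were "up 45%" year over year (derived)

drop_albums = (albums_2006 - albums_2007) / albums_2006
print(f"album-only drop: {drop_albums:.1%}")        # ~14.9%, reported as 15%

tea_2007 = albums_2007 + tracks_2007 / 10           # 10 tracks = 1 album
tea_2006 = albums_2006 + tracks_2006 / 10
drop_tea = (tea_2006 - tea_2007) / tea_2006
print(f"with track-equivalents: {drop_tea:.1%}")    # ~9.5%
```

The single-track boom softens the headline decline but, at a tenth of an album apiece, nowhere near offsets it.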
For the first time since SoundScan started tracking genre sales, all 12 genres dropped, with rap down 30% and country more than 16%.
Overall music purchases, including albums, singles, digital tracks and music videos, rose to 1.35 billion units, up 14% from 2006.
The recording industry has seen CD album sales decline for years, in part because of the rise of online file-sharing, but also as consumers have spent more of their leisure dollars on other entertainment purchases, such as DVDs and video games.
Warner Music Group Corp. artist Josh Groban had the best-selling album with "Noel." The album, a collection of Christmas songs, sold around 3.7 million copies. A soundtrack for The Walt Disney Co.'s popular "High School Musical" franchise was second with around 2.9 million units sold.
Among last year's other top-selling albums were a "Hannah Montana" soundtrack and offerings from the Eagles, Alicia Keys and Fergie.
Three out of the five top-selling albums for the year were released late in the fourth quarter.
The major recording companies' album market share remained nearly the same, with Vivendi SA's Universal Music Group holding a 31.9% share, up slightly from the previous year. Sony BMG Music Entertainment, a joint venture of Sony Corp. and Bertelsmann AG, continued to rank second with 24.97%, though its share dropped 2.4 percentage points from 2006.
One trend that should prove encouraging to record labels: 50 million albums were downloaded last year, a 53% uptick.
"That says consumers are embracing both the track format and the digital album format," said Rob Sisco, president of Nielsen Music. In all, 23% of music sales were derived from digital purchases, Mr. Sisco said.
The holiday season brought an upswell of music purchases, with music sales in the last week of the year totaling 58.4 million units, the biggest sales week ever recorded by Nielsen SoundScan.
David Pakman, chief executive of eMusic.com Inc., attributed strong holiday sales at the online music retailer in part to an apparent pickup in sales of low-cost digital music players.
"That's showing us that digital music adoption is reaching into some price-sensitive areas," Mr. Pakman said. (info from The Associated Press)
Friday, January 4, 2008
2007: Toyota beats Ford in the US
Toyota overtook Ford to become the No. 2 automaker by US sales in 2007, breaking Ford's 76-year lock on the position.
Toyota sold 2.62 million cars and trucks in 2007, 48,226 more than Ford, according to sales figures released Thursday. Toyota's sales were up 3 percent for the year, buoyed by new products like the Tundra pickup, which saw sales jump 57 percent. Ford's sales fell 12 percent to 2.572 million vehicles.
General Motors remained the US sales leader, selling 3.82 million vehicles in 2007. That was down 6 percent from the previous year as customers turned away from some large sedans and sport utility vehicles and GM cut low-profit sales to employees and rental car agencies. GM's car sales fell 8 percent for the year while truck sales were down 4 percent.
Ford's car sales plummeted 24 percent for all of 2007 as some models like the Ford Mustang aged and a new Ford Taurus sedan was unable to match the volumes of the older version. Ford also cut rental-car sales by 32 percent over the year. Truck sales were down 5 percent.
Ford corporate historian Bob Kreipke said it was the first time since 1931 that Ford wasn't second behind GM in US sales. (info from The Associated Press)
Thursday, January 3, 2008
2007: half of Americans have digital TVs
As of the end of 2007, more than 50 percent of US households owned digital televisions, according to new research released by the Consumer Electronics Association. As the nation transitions to digital television, consumers are buying DTVs at a record pace.
According to sales projections, manufacturers will post 11 percent revenue growth, to over $25 billion, from sales of digital televisions in 2007. CEA also forecasts 13 percent revenue and 17 percent unit sales growth for digital television in 2008.
Wednesday, January 2, 2008
1971: Dan Cooper bails out of hijacked plane
The FBI is making a new effort to identify mysterious skyjacker Dan Cooper, who bailed out of an airliner in 1971 and vanished. The man, also known as D.B. Cooper, boarded a Northwest flight in Portland for Seattle on Nov. 24, 1971, and commandeered the plane, claiming he had dynamite.
In Seattle, he demanded and got $200,000 and four parachutes and demanded to be flown to Mexico. Somewhere over southwestern Washington, he jumped out the plane's rear exit with two chutes.
The FBI said that while Cooper was originally thought to have been an experienced jumper, it has since concluded that was wrong and that he almost certainly didn't survive the jump in the dark and rain. He hadn't specified a route for the plane to fly and had no way of knowing where he was when he went out the exit.
"Diving into the wilderness without a plan, without the right equipment, in such terrible conditions, he probably never even got his chute open," agent Larry Carr said. He also didn't notice that his reserve chute was intended only for training and was sewn shut.
Several people have claimed to be Cooper but were dismissed. In 1980, a boy walking near the Columbia River found $5,800 of the stolen money.
"Maybe a hydrologist can use the latest technology to trace the $5,800 in ransom money found in 1980 to where Cooper landed upstream," Carr said. "Or maybe someone just remembers that odd uncle." (info from The Associated Press)