A federal judge Thursday refused a request from the nation's last operating horse slaughterhouse to remain open.
Cavel International's facility in DeKalb, IL, was set to close at midnight Friday, when a temporary court order allowing it to stay open was to expire. The plant, about 60 miles west of Chicago, is the last in the United States that slaughters horses for human consumption. Except for a portion sold to US zoos, the meat is shipped to overseas diners.
"Obviously we're disappointed with the ruling," said Calabrese, adding that the company could file an appeal as soon as Friday.
In late May, Gov. Rod Blagojevich signed a law banning the import, export, possession and slaughter of horses intended for human consumption — forcing the Cavel plant to close for about a week. The company immediately challenged the state law in federal court. U.S. District Court Judge Frederick J. Kapala granted a temporary order in early June that prevented officials from enforcing the ban. Kapala extended the order once, but refused Thursday to allow Cavel to stay open longer.
He wrote that he "no longer believes that plaintiffs have shown a strong or even negligible likelihood of succeeding on the merits of the action pending before this court."
Kapala hasn't ruled on Cavel's original challenge. He said he will not do so until a related matter — whether the Humane Society of the United States can be a party in the case — goes through the courts. "Our primary reaction is we'll wait and see what the next step is," said Ann Spillane, chief of staff for the state attorney general. "There are obviously further legal proceedings that are going to happen."
The Cavel plant has operated in DeKalb for about 20 years and slaughters about 1,000 horses a week, according to plant officials. Cavel lawyers say the Illinois law violates the interstate and foreign commerce clauses of the U.S. Constitution. They argue the plant's closure would deprive about 55 people of jobs. Supporters say without horse slaughterhouses, more older or otherwise marginalized horses would be neglected or abandoned because some owners won't pay the cost to have them euthanized.
Critics say the slaughterhouse process is inhumane. Some also argue the nation has no tradition of raising horses for meat, and shouldn't be doing so to satisfy foreign consumers. (info from the Associated Press)
Friday, June 29, 2007
Thursday, June 28, 2007
1892: first sneakers
The sneaker is a relatively modern invention with its roots in the Industrial Revolution. With new materials like vulcanized rubber and new production methods like assembly lines, shoes could now be manufactured cheaply and efficiently, where once each shoe had to be made by hand by a shoemaker.
The early part of the 20th century witnessed the birth of many of the familiar sneaker brands, but sneakers remained the domain of athletes until Hollywood picked up on the fashion, first in the 1930s and again in the 1950s, when teen icon James Dean was photographed wearing jeans, a t-shirt, and sneakers. From then on, these cheap, durable shoes became part of the official uniform of kids around the world.
The following timeline will illustrate the technological and cultural history of the sneaker.
1800s: The first rubber-soled shoes, called plimsolls, are manufactured.
1892: Goodyear, then a rubber shoe company and division of the U.S. Rubber Company, begins to manufacture rubber and canvas shoes under different names, finally settling on Keds.
1908: Marquis M. Converse establishes Converse shoe company, revolutionizing the game of basketball and becoming an American icon.
1917: Keds are the first mass-marketed athletic shoes. These shoes are later called sneakers by Henry Nelson McKinney, an advertising agent for N. W. Ayer & Son, because the quiet soles make no noise on any surface.
1917: Converse releases the world's first performance basketball shoe, the Converse All Star.
1920: Adi Dassler, founder of Adidas, begins producing handmade training shoes in his mother's washroom, without electricity.
1923: The All Star gives way to the Chuck Taylor All Star, a staple of basketball players, kids, and rebels for more than 50 years. Also known as Chucks, Cons, Connies, more than 744 million pairs have been sold in 144 countries.
1924: Adi and Rudolph Dassler, with the help of some 50 family members, register their business as Gebrüder Dassler Schuhfabrik in Herzogenaurach, Germany.
1931: Adidas produces its first tennis shoe.
1933: Canvas footwear pioneer BF Goodrich patents the Posture Foundation insole, an innovation in comfort and performance, and begins adding the new technology to its action shoes. Goodrich shoes with Posture Foundation become known simply as "P-F" in 1937.
1935: Converse releases the Jack Purcell with its telltale "Smile" on the front. They became a staple of early Hollywood and the bad boy crowd, and remained famous long after the 1930s badminton and tennis champion Jack Purcell had faded into history. (Editor's Note: Jack Purcell sneakers were probably the first premium sneaker, selling for $8.98 a pair in the mid 1960s, when normal sneakers sold for $5.98. At the time, they were the only sneaker made in different widths. I got a 50-cent "spiff" for each pair I sold.)
1948: Puma Schuhfabrik Rudolf Dassler is founded and the world is introduced to the PUMA Atom, PUMA's first football shoe worn by members of the West German football team.
1950's: Sneakers were the preferred footwear of teenagers and the symbol of rebellion. These cheap and easily obtained shoes were worn by students around the world. In the US, cheerleaders wore sweaters, short skirts, and ankle socks with Keds. The fashion got a boost when James Dean was photographed wearing Levis jeans and white sneakers.
In the mid-20th century, the sneaker became a more common cultural phenomenon, with emphasis put on new technologies for athletes, and athletes such as Michael Jordan were paid to endorse expensive sneakers, some selling for over $100 per pair. Sneakers became a status symbol, and theft became a problem in inner cities. In 1968, Puma was the first sports shoe manufacturer to offer Velcro fasteners. In 1969, quarterback Joe Namath, wearing Pumas, led the New York Jets to victory in Super Bowl III.
The latter part of the 20th century and the first part of the 21st were all about celebrity endorsements and limited editions. Nike released retro editions of the classic Air Jordans and continued to release new models. In 1971, the Nike Swoosh trademark was purchased from a graphic design student for $35. Bill Bowerman, a father of Nike Inc., died in December 1999, but Nike kept going strong, releasing a revolutionary cushioning system called Nike Shox. Reebok, in a bid to strengthen its sales, formed partnerships with various music artists to create its Sound and Rhythm line. In 2003, Nike acquired sneaker pioneer Converse.
(info from SneakerHead.com, PFFlyers.com)
Wednesday, June 27, 2007
2007: death of last Iwo Jima flag-raiser
The small island of Iwo Jima is 660 miles south of Tokyo. One of its outstanding geographical features is Mount Suribachi, an extinct volcano that forms the narrow southern tip of the island and rises 550 feet to dominate the area.
During World War II, by February 1945, US troops had recaptured most of the territory taken by the Japanese in 1941 and 1942; but Iwo Jima had not been taken, and it was a primary objective in American plans to conclude the Pacific campaign.
On the morning of February 19, 1945, the 4th and 5th Marine Divisions invaded Iwo Jima after a somewhat ineffective bombardment lasting 72 hours. The 28th Regiment, 5th Division, was ordered to capture Mount Suribachi. They reached the base of the mountain on the afternoon of February 21, and by nightfall the next day had almost completely surrounded it.
On the morning of February 23, Marines of Company E, 2nd Battalion, started the tortuous climb up the rough terrain to the top. At about 10:30 a.m., men all over the island were thrilled by the sight of a small American flag flying from atop Mount Suribachi.
That afternoon, when the slopes were clear of enemy resistance, a second, larger flag was raised by five Marines and a Navy hospital corpsman: Sgt. Michael Strank, Cpl. Harlon H. Block, Pfc. Franklin R. Sousley, Pfc. Rene A. Gagnon, Pfc. Ira Hayes, and PhM. 2/c John H. Bradley, USN.
Photographer Joe Rosenthal caught the afternoon flag raising in an inspiring Pulitzer Prize winning photograph. When the picture was later released, sculptor Felix W. de Weldon, then on duty with the US Navy, was so moved by the scene that he constructed a scale model and then a life-size model of it.
Gagnon, Hayes, and Bradley, the three survivors of the flag raising (the others having been killed in later phases of the Iwo Jima battle), posed for the sculptor who modeled their faces in clay. All available pictures and physical statistics of the three who had given their lives were collected and then used in the modeling of their faces.
Erection of the memorial, which was designed by Horace W. Peaslee, was begun in September 1954. It was officially dedicated by President Dwight D. Eisenhower on November 10, 1954, the 179th anniversary of the U.S. Marine Corps.
Of the over 22,000 Japanese soldiers, 20,703 died and 216 were captured. The Allied forces suffered 27,909 casualties, with 6,825 killed in action. The number of American casualties was greater than the total Allied casualties on D-Day. Iwo Jima was also the only US Marine battle where American casualties exceeded Japanese casualties. As all the civilians had been evacuated, there was not one civilian casualty.
The last known surviving flag-raiser, Charles W. Lindberg, who helped put up the first flag, died Sunday in Minnesota. He was 86.
But there remain lingering disputes over the identity of at least one man in the first flag-raising.
A California veteran of Iwo Jima, Raymond Jacobs, has said he believes he is the man with a radio on his back who had usually been identified as Pfc. Gene Marshall, a radio operator with the 5th Marine Division who died in 1987. The other men involved in the raising all have died.
Though most of the American dead were recovered in 1948, some 250 U.S. troops are still missing from the Iwo Jima campaign. Many were lost at sea, meaning the chances of recovering their remains are slim. But many others died in caves or were buried by explosions.
Japan's government and military are helping with the search on Iwo Jima, which this month was officially renamed Iwo To, the island's name before the war.
(info from National Park Service, Wikipedia, and the Associated Press) (photo of Iwo Jima memorial statue is based on photo of second flag raising by Joe Rosenthal, from National Defense University)
(Editor's remote connections with Iwo Jima flag raising: #1, as a Cub Scout, I helped hold up an American flag in a re-enactment of the historical event. #2, I've stayed at the Iwo Jima Motel, near the memorial in Arlington, VA. It was a bargain, and very convenient to Washington.)
Tuesday, June 26, 2007
1913: bra is patented
The first modern brassiere to receive a patent was invented in 1913 by a New York socialite named Mary Phelps Jacob. She had purchased a sheer evening gown for a social event, and at that time, the only acceptable undergarment was a corset stiffened with whale bones.
Mary found that the whale bones poked out visibly around the plunging neckline and under the sheer fabric. Two silk handkerchiefs and some pink ribbon later, Mary had designed an alternative to the corset.
The corset, an unhealthy and painful device designed to narrow an adult woman's waist to as little as 10 inches, is attributed to Catherine de Médicis, wife of King Henri II of France. She enforced a ban on thick waists at court in the 1550s, and started over 350 years of whalebones, steel rods, and torture.
Mary Phelps Jacob's new undergarment complemented the new fashions introduced at the time, and demand from friends and family was high for the new brassiere. Mary was the first to patent an undergarment named 'Brassiere', derived from the old French word for 'upper arm'. Her patent was for a device that was lightweight and soft, and that separated the breasts naturally.
Caresse Crosby was the business name Mary Phelps Jacob used for her brassiere production. However, she didn't enjoy running a business, and sold the patent to the Warner Brothers Corset Company in Bridgeport, Connecticut, for $1,500. Warner (the bra-makers, not the movie-makers) made over $15 million from the bra patent over the next thirty years.
Other brassiere history:
In 1875, manufacturers George Frost and George Phelps patented the Union Under-Flannel, an under-outfit with no bones, no eyelets, and no laces or pulleys.
In 1893, Marie Tucek patented the breast supporter. The device included separate pockets for the breasts and straps that went over the shoulder, fastened by hook-and-eye closures.
In 1889, corset-maker Herminie Cadolle invented the Well-Being, or 'Bien-être', a bra-like device sold as a health aid. Where the corset squeezed the breasts up from below, Cadolle's design supported them from the shoulders down.
World War I dealt the corset a fatal blow when the US War Industries Board called on women to stop buying corsets in 1917. It freed up some 28,000 tons of metal.
In 1928, a Russian immigrant named Ida Rosenthal founded Maidenform, which pioneered grouping women into cup sizes.
The Bali Brassiere Company was founded by Sam and Sara Stein in 1927. Bali's best-known product has been the WonderBra, an underwired bra with side padding designed to uplift and add cleavage. Bali launched the WonderBra in the US in 1994, but it was invented in 1963 by Canadian designer Louise Poirier. According to Wonderbra USA, "this unique garment, the forerunner of today's Wonderbra push-up bra, had 54 design elements that lifted and supported the bust to create dramatic cleavage. Its precision engineering involved three-part cup construction, precision-angled back and underwire cups, removable pads called cookies, gate back design for support, and rigid straps."
Frederick's of Hollywood was started by Frederick Mellinger (inventor of the push-up bra) in 1946. The original flagship store was a landmark on Hollywood Boulevard in California. In September 2005, after 59 years, the store moved to a larger, more modern space a few blocks away. The connection Mellinger had with Hollywood led fashion-conscious women to seek out his pointed, cone-stitched bras, sold under brand names such as Missiles. In the sixties, the "Cadillac" bra was launched, and soon became the company's best seller. Other innovations included the front-hook bra, and bras with shoulder pads. By the 70s, when women were burning bras outside Frederick's store, he had enough media sense to proclaim in public that the "law of gravity will win out." It was a publicity coup, and sales of his bras soared.
Victoria's Secret was started in San Francisco in 1977 by Stanford Graduate School of Business alumnus Roy Raymond, who felt embarrassed trying to purchase lingerie for his wife in a very public and awkward department store environment. He opened the first store at Stanford Shopping Center, and quickly followed it with a mail order catalog and three other stores. The stores were meant to create a comfortable environment for men, with wood paneled walls, Victorian details, and helpful sales staff. Instead of racks of bras and panties in every size, there were single styles, paired together and mounted on the wall in frames. Men could browse the styles, and sales staff would then help estimate the appropriate size. The company gained notoriety in the early 1990s when it began to use supermodels in advertising and fashion shows. (info from About.com, Lingerie Uncovered, and Wikipedia) (photo from Victoria's Secret)
Monday, June 25, 2007
1878: first telephone switchboard
1878: first telephone directory
1878: first phone in the White House
On January 28, 1878, the first commercial switchboard began operating in New Haven, Connecticut. It served 21 telephones on eight lines, so many customers had party-line service.
On February 21, 1878, the New Haven District Telephone Company published the world's first telephone directory -- a single page with only fifty names, and no phone numbers. Some 129 years later, several of the entities in that first directory are still in business, but now they have phone numbers.
Also, in 1878 the Rutherford B. Hayes administration arranged for the installation of the first telephone in the White House. According to a report, the first call made by President Hayes went to Alexander Graham Bell himself, thirteen miles away. Hayes' first words instructed Bell to speak more slowly. (info from Privateline.com)
Friday, June 22, 2007
1982: Bell System offers non-Touchtone pushbutton phones
The Touchtone phone made its official debut at the 1964 World's Fair in New York City, and the local Bell System companies quickly began promoting and installing them.
The novelty and speed of push-button dialing had great appeal, and the Bell companies were able to charge a premium price, despite the fact that Touchtone calls saved them money: since calls could be processed faster in the Central Offices, less equipment was needed to process them.
Some people refused to pay the extra fee, and some phone companies were slow to upgrade their equipment to recognize the musical tones. Sensing a potential market, several manufacturers developed phones with outpulse dials. They had buttons that looked just like Touchtone buttons, but when you pressed a button, you heard clicks, not beeps.
The outpulse circuitry put out a string of pulses (rapidly disconnecting and reconnecting the phone) when each button was pressed. It wasn't as fast as real Touchtone (it took much longer to dial eight than to dial one); but the phones could be used anywhere, and there was no extra charge.
By about 1980, AT&T operated a national chain of Phone Center Stores, some in malls and some in Bell System Buildings, offering a mixture of for-rent and for-sale phones. Most models came in both Touchtone and rotary-dial versions.
In about 1982, a strange new model appeared on the shelves: The Pac-Man phone.
Pac-Man was an extremely popular video game, first introduced in 1979 in Japan, and quickly spreading around the world. It was an icon of 1980s culture, and remains popular decades later. The image of a chomping yellow face became ubiquitous, appearing on everything from clothing to phones.
The Pac-Man phone was made in Hong Kong, and distributed in the US by American Telecommunications Corp., which supplied the Bell companies with a variety of decorator and novelty phones, including Mickey Mouse, Snoopy and Kermit.
The phone was very popular, but an unreliable piece of crap. It probably had the highest return rate of any phone sold by the Bell System stores -- but it has a place in telecommunications history as Bell's first outpulse phone. (photo from eBay in Australia)
Thursday, June 21, 2007
1891: first slot machines in New York
1934: end of slot machines in New York
Slot machines started in New York in 1891. The first were designed by Sittman and Pitt, and had five drums that displayed poker hands. There was no payout mechanism, so the New York establishments that bought the machines awarded prizes of their own, usually free drinks.
Then came Charles Fey. He built the Liberty Bell, the first slot machine with an automatic payout, in his basement. Slot machines didn't achieve widespread success until years later, when they were installed in the Flamingo Hilton hotel on the Las Vegas Strip.
Fey's first slot machine was different from anything made today. It was over 100 pounds of cast iron, and lacked the fruit symbols commonly associated with slots. Instead it had stars, horseshoes, and suits from playing cards, like diamonds and spades. The Liberty Bell gave a fifty-cent payout to winners, which was substantial in its day. The Liberty Belle Saloon and Restaurant in Reno still has the very first Liberty Bell designed by Fey. The establishment is owned by his grandchildren, who preserve Fey's legacy in the history of slot machines.
Then Fey made the Operator Bell slot machine. This slot featured the famous fruit design, which became the standard for slot machine aesthetics. The history of slot machines was changed forever. As anti-slot-machine sentiment began to rise, Fey had to be clever, and he designed many machines to work like vending machines. This would later be the bane of vending machine owners, as the public often confused the two, and police capitalized on the confusion when they needed good press. The Bell-Fruit Gum Company, reputed to have stolen a slot machine from Fey, was the first to mass-produce machines that dispensed gum with every pull in order to mask the nature of the slot machine. This is where the BAR symbol comes from: it was an effort to market the company's gum.
The anti-gambling movement, which piggybacked on the temperance movement, proved to be trouble for Fey. Slot machines became illegal in San Francisco in 1909 and in Nevada a year later. By 1911, they were banned throughout California. By the thirties, it was politically popular to be anti-gambling, and especially anti-slot machine.
New York Mayor Fiorello LaGuardia had a photo op on a barge, dumping New York City's machines at sea. Most of the machines weren't even slot machines; they were nothing more than common vending machines. The city had confiscated many legitimate vending machines in order to score a public relations coup. It was a black day in the history of slot machines. (info from Slots.cd) (photo from earlyvegas.com)
Wednesday, June 20, 2007
2003: last COMDEX
COMDEX (Computer Dealers Expo) was a huge trade show held mostly in Las Vegas from 1979 to 2003, and was one of the largest trade shows in the world. The first show, held in 1979 at the MGM Grand, drew 167 exhibitors and 3,904 attendees.
Originally open only to those directly involved in the computer industry, COMDEX was the one show where all levels of manufacturers and developers of computers, peripherals, software, components and accessories came in direct contact with retailers, consultants, and their competitors.
Colloquially known as "Geek Week," COMDEX evolved into a major technical convention, with the industry making major product announcements and releases there. Numerous small companies from around the world rose to prominence following appearances at COMDEX, and industry leaders sought opportunities to make keynote addresses. While a few keynotes were used as product promotions (including those by Microsoft head Bill Gates), most discussed the computer industry, its history, trends and future potential. Commercial acceptance of the Linux family of operating systems got a major boost following a 1999 keynote appearance by its creator, Linus Torvalds.
In the late 1980s, COMDEX was opened to the public, causing an explosion in attendance, but also a dilution of COMDEX's impact on the industry and a loss of the focus which had made the show a "must-attend" event. Retailers and consultants complained that "leading edge" customers, upon whom they relied for early adoption of new technology, were buying products at "show specials" and then expecting the dealers to support those products.
At the same time, costs of attendance ballooned with the number of attendees. Hotels as far away as Primm, 45 miles distant at the California state line, were packed even when charging several times their regular rates. Reservations for closer rooms for the following year's show were often sold out during the current show at up to several thousand dollars per night.
Hotels justified the higher prices by noting that, while COMDEX attendees saturated lodging facilities, on average they spent less time (and money) in the casinos which were the hotels' lifeblood.
After the Spring 1981 show in New York and 1982 in Atlantic City, COMDEX began regular spring shows in Atlanta from 1983 through 1988 (including a show combined with the Consumer Electronics Show that I attended). Then it started alternating sites almost every other year between Atlanta and Chicago, though the show spent two straight years in Chicago because of preparations in Atlanta for the 1996 Summer Olympic Games.
Following COMDEX Fall 1999, organizers made major changes to their criteria for admission of media, rejecting nearly all but those who were on editorial assignment from a handful of "acknowledged" trade papers. Though offered regular "public" attendance, this left hundreds of regular, long-standing press attendees from magazines and newspapers around the world with bad feelings toward the show. As press credentials were necessary to gain the level of access necessary to make the expensive trip worthwhile, most refused to go and many told vendors that they would disregard product announcements made at or in relation to COMDEX.
When other computer hardware exhibitions such as CeBIT in Germany and COMPUTEX in Taiwan continued to expand, the runaway costs and decline in quality of COMDEX had negative impacts. In addition, the annual Consumer Electronics Show in the US had gained importance, and many exhibitors determined that CES was the more cost-effective show. In 2000, major companies such as IBM, Apple Computer, and Compaq (now merged with Hewlett-Packard) decided to discontinue their involvement with COMDEX to allocate their resources more efficiently. Attendance dropped after the September 11, 2001 terrorist attacks.
In June 2004, COMDEX officially postponed the 2004 exhibition in Las Vegas due to lack of heavyweight participants. COMDEX was cancelled for 2005 and 2006, and its future status is uncertain.
Comdex was started by Las Vegas hotel magnate and philanthropist Sheldon Adelson, who sold it to Japanese technology conglomerate Softbank Corp. in 1995 for $860 million. In 2003, Softbank sold it to MediaLive, which apparently used the name Key3Media and went bust. In 2006, MediaLive was sold to CMP Media, which holds trade shows and publishes Information Week and other trade publications. A COMDEX.com website is currently operated by CMP, which said it hopes to bring back the COMDEX show. (info from Wikipedia and other sources) (photo from BBC)
Tuesday, June 19, 2007
1868: first Swiss wristwatch
Patek Philippe is a prestigious name in Swiss watchmaking, combining nearly two centuries of experience and tradition with continued innovation. The company has been granted more than 70 patents.
Patek Philippe is the only company that crafts all of its mechanical movements according to the strict specifications of the Geneva Seal. The Geneva Seal has been granted since 1886 to locally crafted mechanical movements which comply with 12 criteria for micromechanical engineering, hand finishing, assembly and precision time regulation.
In 1839, two Polish immigrants to Switzerland, Antoine Norbert de Patek (a salesman) and François Czapek (a watchmaker), joined forces to found Patek, Czapek & Cie. In 1844, Patek met French watchmaker Adrien Philippe in Paris, where Philippe introduced his innovative winding and setting by the watch crown. In 1845, when Czapek decided to leave the company, the name changed to Patek & Cie. In 1851, when Philippe became part of the company, its name was changed again, to Patek Philippe & Cie. It changed in 1901 to become Ancienne Manufacture d’Horlogerie Patek Philippe & Cie, S.A.
In 1932, the company was purchased by brothers Charles and Jean Stern, and since then, Patek Philippe S.A. has remained a family owned company, with 3rd and 4th generation Sterns in charge: Philippe Stern, President and Thierry Stern, his son, Vice-President.
Back in 1868, Patek Philippe created the first Swiss wristwatch, made for Countess Koscowicz of Hungary. (info from Patek Philippe)
Monday, June 18, 2007
2007: first Argentinian wins US Open
Holding off the number one and number three players in the world, Argentinian Angel Cabrera won the 107th US Open golf tournament in Oakmont, Pennsylvania on June 17.
Cabrera is the first Argentinian to win the US Open in its 107-year history. In fact, this is only the second time an Argentinian has won a major; the first was the British Open in 1967.
Cabrera also joins the list of those whose first PGA tour win is the US Open.
Tiger Woods finished tied for second with Jim Furyk after attempting a 20-foot birdie putt on the 18th that would have forced a playoff. He wasn’t able to beat the toughest hole on the course, and settled for a par and a tie for second.
Furyk finished at +6, and became the first player to finish second in consecutive Opens. (info from Bodog Beat, photo from The Associated Press)
Friday, June 15, 2007
1938: first effective sunscreen (suntan lotion)
The ancient Greeks used olive oil as a type of sunscreen, but it did not work very well. An effective sunscreen was developed in 1938 by Swiss chemistry student Franz Greiter, after he severely burned himself during an ascent of Piz Buin on the border between Switzerland and Austria. He named his product, which he had developed in a small laboratory in his parents' home, Gletscher Creme (Glacier Cream). Surviving examples of the Glacier Cream have been shown to have an SPF of 2, so it could be classed as an effective sunscreen.
In 1944, during World War II, many soldiers were getting serious sunburns. A pharmacist named Benjamin Green decided to create something that would protect soldiers from the sun’s harmful rays. In his wife’s kitchen, he created a sticky red substance that he called "red vet pet" (red veterinary petrolatum), which worked primarily by physically blocking the sun's rays with a thick petroleum-based product similar to Vaseline. Green tested it on his own bald head. It did not work nearly as well as modern sunscreens, but it was a start.
Sunscreen has come a long way since those early days. Modern products have much higher protection factors than Green's sunscreen, and many are water- and sweat-resistant. But there are also negative effects. Some people rely too much on the product and do not understand the limitations of the sun protection factor (SPF); they assume that buying anything over SPF 30 will automatically prevent them from getting burned, no matter how long they stay in the sun. Too much sunbathing is one of the major causes of skin cancer across the world.
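To see why the SPF number can mislead, it helps to know that SPF works roughly as a multiplier on the time skin can be exposed before burning, with sharply diminishing returns in UVB blocked above SPF 30. A back-of-the-envelope sketch (a deliberately simplified model; real protection depends heavily on how much lotion is applied and how often it's reapplied):

```python
# Rough rule of thumb: SPF multiplies the unprotected time-to-burn.
# This is a simplification, not medical guidance; real-world protection
# varies with application thickness, sweat, water, and reapplication.
def protected_minutes(unprotected_burn_minutes: float, spf: float) -> float:
    return unprotected_burn_minutes * spf

def uvb_blocked_fraction(spf: float) -> float:
    """Fraction of UVB blocked under the standard 1 - 1/SPF approximation."""
    return 1 - 1 / spf

# Someone who burns in 10 minutes unprotected:
print(protected_minutes(10, 15))          # 150.0 minutes
print(protected_minutes(10, 30))          # 300.0 minutes
# Diminishing returns: SPF 30 blocks ~96.7% of UVB vs ~93.3% for SPF 15.
print(round(uvb_blocked_fraction(15), 3))  # 0.933
print(round(uvb_blocked_fraction(30), 3))  # 0.967
```

The gap between SPF 30 and much higher numbers is only a few percentage points of UVB blocked, which is why no SPF justifies unlimited time in the sun.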
In 1944, Green used his invention as the basis for Coppertone Suntan Cream, the very first consumer sunscreen product: a mixture of cocoa butter and jasmine concocted on his wife's stove and, again, tested on his own bald head.
The company became famous in 1959 when it introduced the Coppertone girl, a 3-year-old girl in pigtails named Cheri Brand. Cheri had posed for a photo in her backyard and soon became Little Miss Coppertone, a symbol of summer in the long-running Coppertone ad campaign whose famous slogans proclaim "Don't be a Paleface!" and "Tan - Don't Burn."
The Coppertone ad campaign was based on drawings created by Cheri's mother, Joyce Ballantyne Brand, a commercial artist (who also drew the Pampers baby in 1977). She was paid $2,500 for the artwork based on her little girl.
Cheri Brand once said "Everybody has their baby pictures in their family album with their diapers falling off, I just happened to have mine on a billboard" (and a few million bottles of suntan lotion).
The dog depicted in the ad pulling down the pants of the little girl to reveal her tan-line was based on a neighbor's cocker spaniel. A popular belief that Jodie Foster was the original Coppertone girl is misleading. Foster did, however, get her start in showbiz for a Coppertone ad in 1965. She was three years old at the time and appeared in the ad on a boat with her family.
Competitor Hawaiian Tan goes back to 1964, with the founding of Liana of Waikiki by Dr. Horst Baumgartner in Honolulu. As a chemist, Dr. Baumgartner was not satisfied with the suntan lotions then on the market, and he formulated Hawaiian Tan, which was introduced in the Hawaiian Pavilion at the 1964 World's Fair in New York. Hawaiian Tan soon became increasingly popular with tourists in Hawaii, and its popularity spread to the US mainland.
Following its start in Hawaii, the company moved to the US Virgin Islands, and Hawaiian Tan became popular with tourists to the Caribbean. The original product line was augmented to include coconut-based Coco Sol tan accelerator oil and lotion, Coco Baby Oil, Coco Lotion hand and body moisturizers, and other after-sun skin care products. In 1978, the company moved to California, and was incorporated as Caribia, Inc. (info from Wikipedia, Hawaiian Tan, Coppertone, TV Acres)
Thursday, June 14, 2007
1939: first American cars with automatic transmission
The automatic transmission was based on technology first developed in the early 1900s by German manufacturers of marine engines, but it was not adapted for automobiles for several decades.
In 1938, General Motors developed the first line of cars with automatic transmissions -- Oldsmobiles that offered "Hydra-Matic drive." The cars were first produced in the Fall of 1939, as 1940 models. In 1941, Chrysler followed suit and introduced three different cars that offered their version of automatic drive, "Vacamatic" (later called "Fluid Drive"). Automatic transmission was a fairly common option on most American cars by 1948.
The Olds tranny provided true clutchless driving with four forward speeds. Its fluid coupling between engine and transmission eliminated the clutch and its associated footwork. Olds made the breakthrough Hydra-Matic available on all models for only $57.
The Chrysler Vacamatic was really only semi-automatic. It featured four speeds and would switch automatically between the two lower or two higher gears, but the driver needed to use the clutch to switch from a lower gear to a higher gear or vice versa. (info from Yahoo and Wikipedia, photo from Hemmings)
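The Vacamatic's semi-automatic behavior described above can be sketched as a tiny state model. The two-gears-per-range layout follows the paragraph above, but the shift-point speed is an illustrative assumption, not Chrysler's actual figure:

```python
# Sketch of semi-automatic (Vacamatic-style) shifting: the driver picks
# a range with the clutch; within a range, the transmission shifts
# between its two gears automatically based on speed.
RANGES = {
    "low":  (1, 2),   # entered with the clutch
    "high": (3, 4),   # entered with the clutch
}
UPSHIFT_MPH = 15      # assumed automatic upshift point, for illustration

def current_gear(range_name: str, speed_mph: float) -> int:
    """Gear in use for a given driver-selected range and road speed."""
    low_gear, high_gear = RANGES[range_name]
    return high_gear if speed_mph >= UPSHIFT_MPH else low_gear

print(current_gear("low", 10))   # 1
print(current_gear("low", 20))   # 2 (automatic shift within the range)
print(current_gear("high", 40))  # 4 (changing range itself needed the clutch)
```

The contrast with the Hydra-Matic is that GM's design could move through all four gears without any driver intervention, while Chrysler's only automated the shift inside each two-gear range.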
Wednesday, June 13, 2007
2007: death of "Mr. Wizard"
"Mr. Wizard" Don Herbert, who explained the world of science to millions of young baby boomers on television in the 1950s and 60s and for a later generation on the Nickelodeon cable TV channel in the 1980s, died Tuesday.
He was 89, and died at his home in Bell Canyon, CA after a long battle with multiple myeloma.
A low-key, avuncular presence who wore a white dress shirt with the sleeves rolled up and a tie, Herbert launched his weekly half-hour science show for children on NBC in 1951. Watch Mr. Wizard was broadcast live from Chicago on Saturdays the first few years and then from New York. It ran for 14 years.
Herbert used basic experiments to teach scientific principles to his TV audience via an in-studio guest boy or girl who assisted in the experiments.
"I was a grade school kid in the '50s and watched 'Mr. Wizard' Saturday mornings and was just glued to the television," said Nikosey, president of Mr. Wizard Studios, which sells Mr. Herbert's science books and old TV shows on DVD.
"The show just heightened my curiosity about science and the way things worked," said Nikosey. "I learned an awful lot from him, as did millions of other kids."
By 1955, there were about 5,000 Mr. Wizard Science Clubs across the US, with more than 100,000 members.
In explaining how he brought a sense of wonder to elementary scientific experiments, Herbert said that he "would perform the trick, as it were, to hook the kids, and then explain the science later."
He said, "A lot of scientists criticized us for using the words 'magic' and 'mystery' in the show's subtitle, but they came around eventually."
Watch Mr. Wizard garnered numerous honors, including a Peabody Award and the Thomas Alva Edison Foundation Award for Best Science TV Program for Youth.
Herbert had an important and lasting impact.
"Over the years, Don has been personally responsible for more people going into the sciences than any other single person in this country," George Tressel, a National Science Foundation official, said in 1989. "I fully realize the number is virtually endless when I talk to scientists. They all say that Mr. Wizard taught them to think."
After Watch Mr. Wizard ended its 14-year run in 1965, Herbert showed up frequently on talk shows, including The Tonight Show and Late Night With David Letterman. Watch Mr. Wizard was revived on NBC in 1971 for a season, and Mr. Wizard's World ran on Nickelodeon from 1983 to 1990.
Born in 1917 in Waconia, MN, Herbert grew up in La Crosse, WI, and graduated from La Crosse State Teachers College in 1940. In 1942, he volunteered for the Army Air Forces. As a B-24 bomber pilot, he flew 56 missions over Italy, Germany and Yugoslavia and received the Distinguished Flying Cross and the Air Medal with three oak-leaf clusters.
(info from The Seattle Times, photo from The New York Times) CLICK for Mr. Wizard's website. You can order DVDs of the old shows.
Tuesday, June 12, 2007
1952: first TV "anchor"
In 1950, Walter Cronkite joined CBS News in its young and growing television division, recruited by Edward R. Murrow, who had previously tried to hire Cronkite from the United Press during WW2. Cronkite began work at the CBS station in Washington, DC.
On July 7, 1952, the term "anchor" was coined to describe Cronkite's role at both the Democratic and Republican National Conventions, which marked the first nationally televised convention coverage. Cronkite anchored the network's coverage of the 1952 presidential election as well as later conventions, until in 1964 he was temporarily replaced by the team of Robert Trout and Roger Mudd. This proved to be a mistake, and Cronkite was returned to the anchor chair for future political conventions.
From 1953 to 1957, Cronkite hosted the CBS program You Are There, which reenacted historical events, using the format of a news report. His famous last line for these programs was: "What sort of day was it? A day like all days, filled with those events that alter and illuminate our times... and you were there." He also hosted The Twentieth Century, a documentary series about important historical events of the century which was made up almost exclusively of newsreel footage and interviews. It became a long-running hit.
Cronkite succeeded Douglas Edwards as anchorman of the CBS Evening News on April 16, 1962, a job in which he became an American icon. The program expanded from 15 to 30 minutes in 1963, making Cronkite the anchor of American network television's first nightly half-hour news program.
Cronkite announced that he intended to retire from The CBS Evening News on February 14, 1980; at the time, CBS had a policy in place that called for mandatory retirement by age 65. Although sometimes compared to a father figure or an uncle figure, in an interview about his retirement he described himself as being more like a "comfortable old shoe" to his audience. His last day in the anchor chair at the CBS Evening News was on March 6, 1981; he was succeeded the following Monday by Dan Rather.
Cronkite's farewell statement: "This is my last broadcast as the anchorman for The CBS Evening News. For me, it's a moment for which I long have planned, but which, nevertheless, comes with some sadness. For almost two decades, after all, we've been meeting like this in the evenings, and I'll miss that. But to those who have made anything of this departure, I'm afraid it made too much. This is but a transition, a passing of the baton. A great broadcaster and gentleman, Doug Edwards, preceded me in this job, and another, Dan Rather, will follow. And anyway, the person who sits here is but the most conspicuous member of a superb team of journalists; writers, reporters, editors, producers, and none of that will change. Furthermore, I'm not even going away! I'll be back from time to time with special news reports and documentaries, and, beginning in June, every week, with our science program, Universe. Old anchormen, you see, don't fade away; they just keep coming back for more. And that's the way it is: Friday, March 6, 1981. I'll be away on assignment, and Dan Rather will be sitting in here for the next few years. Good night."
When Dan Rather left the evening news show in 2004 after 23 years as anchor, Cronkite said that if he had known he would live so long, he would not have let Rather replace him at age 65.
In May 2007, Cronkite was honored with a CBS special for his 90th birthday. (info from Wikipedia and other sources)
Monday, June 11, 2007
1973: last American president to wear a top hat at inauguration
The first top hats were made with felt, mostly from beaver fur. Later, they would be made of silk. A popular version, particularly in the United States in the 19th century, was the stovepipe hat, which was popularized by Abraham Lincoln. Unlike many top-hats, this version was straight, like piping, and was not wider at the top and bottom. Often they were taller than the typical top-hat.
Later on, top hats were sometimes given an internal hinged frame, making them collapsible. Such hats are often called the "opera hat" or "gibus," though the term can also be synonymous with any top hat, or any tall formal men's hat. In the 1920s they were also often called "high hats."
In the latter half of the 19th century, the top hat gradually fell out of fashion, with the middle classes adopting bowler hats and soft felt hats such as fedoras, which were more convenient for city life, as well as being suitable for mass production.
In comparison, a top hat needed to be handmade by a skilled hatter, with few young people willing to take up what was obviously a dying trade. The top hat became associated with the upper class, becoming a target for satirists and social critics. By the end of World War I it had become a rarity in everyday life. It continued to be used for formal wear, with a morning suit in the daytime and with evening clothes (tuxedo or tailcoat) until the late 1930s. (The top hat is featured as one of the original tokens in the board game Monopoly.)
The top hat persisted in certain areas, such as politics and international diplomacy, for several more years. In the newly-formed Soviet Union, there was a fierce debate as to whether its diplomats should follow the international conventions and wear a top hat, with the pro-toppers winning the vote by a large majority.
The last American president to wear a top hat to an inauguration was Richard Nixon. Gerald Ford was not inaugurated at the Capitol and Jimmy Carter abolished the use of morning dress for inaugurations. It was reinstated, minus a top hat, by Ronald Reagan but not worn by any later presidents to date. (info from Wikipedia)
Friday, June 8, 2007
2006: first country ends analog television
The Netherlands ended transmission of "free to air" analog television in December 2006, becoming the first nation to switch completely to digital signals.
Few Dutch consumers noticed, because the overwhelming majority get TV via cable. Only around 74,000 households relied primarily on the old-fashioned TV antennas in the country of 16 million, although 220,000 people had an "occasional use" set somewhere such as in a vacation house, camper or boat.
Digital television has several advantages over traditional analog TV, the most significant being that digital channels take up less bandwidth space. This means that digital broadcasters can provide more digital channels in the same space, provide High-Definition digital service, or provide other non-television services such as pay-multimedia services or interactive services. Digital television also permits special services such as multicasting (more than one program on the same channel) and electronic program guides. The sale of non-television services may provide an additional revenue source. As well, digital television often has a superior image, improved audio quality, and better reception than analog.
The bandwidth formerly used by analog has been licensed through 2017 by former telecommunications monopoly Royal KPN NV, which will use it to broadcast digital television.
Under its agreement with the government, KPN bore the cost of building digital broadcasting antennas and must continue to broadcast three state-supported channels and several regional public broadcasters free of charge. In return, it can use the rest of the open bandwidth to charge around $18.50 a month for a package of other channels that is comparable with cable.
Whether customers opt for just the free channels or a full cable-like package, they must first buy a tuner to decode the new "digital terrestrial" signals, available for around $66.50.
In the United States, television broadcasts will be exclusively digital as of February 17, 2009. The government will subsidize the purchase of digital adapters for use with older analog televisions. It is expected that the adapters will cost between $60 and $100, and each family can get two $40 payments from the Feds. The Feds are supposed to get the money back when they sell the transmission channels that had been used for analog TV, to cellphone carriers and other companies. (some info from The Associated Press and Wikipedia)
Thursday, June 7, 2007
1933: first drive-in theater
In 1932, New Jersey chemical company magnate Richard M. Hollingshead conducted outdoor theater tests in his driveway in Camden. After nailing a screen to trees in his backyard, he set a 1928 Kodak projector on the hood of his car and put a radio behind the screen, testing different sound levels with his car windows down and up. Blocks under vehicles in the driveway enabled him to determine the size and spacing of ramps so all automobiles could have a clear view of the screen.
Following these experiments, he applied for a patent on his invention, and he was granted U.S. Patent 1,909,537 on May 16, 1933. Seventeen years later, that patent was declared invalid by the Delaware District Court.
Hollingshead's drive-in opened June 6, 1933, in Pennsauken, New Jersey, a short distance from Cooper River Park. He advertised his drive-in theater by saying, "The whole family is welcome, regardless of how noisy the children are."
It operated for only three years, but during that time the concept caught on in other states. The 1934 opening of Shankweiler's Auto Park in Orefield, Pennsylvania, was followed by Galveston's Drive-In Short Reel Theater (1934), the Pico in Los Angeles (1934) and the Weymouth Drive-In Theatre in Weymouth, Massachusetts (1936). In 1937, three more opened in Ohio, Massachusetts and Rhode Island, with another twelve during 1938 and 1939 in California, Florida, Maine, Maryland, Massachusetts, Michigan, New York, Texas and Virginia.
One reason that drive-ins were so popular was that they allowed the entire family to go to the movies without hiring a baby-sitter or worrying that the children would disrupt the audience. The entire family could enjoy a movie in the privacy of their own vehicle, for the same cost that a couple would pay to see a movie in a theater.
Before World War 2, there were approximately 100 major drive-ins nationwide, and the drive-in craze began to build strongly after the end of the war. Many GIs had traveled the country and seen drive-ins. Enterprising businessmen realized that this segment of the population could be tapped to spend some of their earnings enjoying themselves, a date, or an evening with the family.
The drive-in's peak popularity came in the late 1950s and early 1960s, particularly in rural areas, with some 4,000 drive-ins across the US. Parents with a baby could take care of their child while watching a movie, while teenagers found drive-ins ideal for dates. Revenue was more limited than at regular theaters, since showings could only start at twilight. There were abortive attempts to create suitable conditions for daylight viewing, such as large tent structures, but nothing viable was developed.
In the 1950s, the greater privacy afforded to patrons gave drive-ins a reputation as immoral, and they were labeled "passion pits" in the media. During the 1970s, some drive-ins changed from family fare to exploitation movies, and some began showing pornographic movies in less family-centered time slots to bring in extra income. This became a problem because it made adult material visible to a wide audience, including some for whom viewing was illegal, and it led to concern about the availability and uncontrollability of adult-centered media in public.
Teenagers with limited incomes developed an ingenious method to see drive-in movies for free: two teenagers (usually a couple) would take their car to the drive-in, and pay for two tickets. After the couple entered the theater and found a parking space, the driver would open the trunk, and other teenagers hidden inside jumped out to enjoy the "free" movies. To ensure one person was not continually stuck with paying, the ticket cost was often rotated or split among the friends.
Nearly anyone who grew up in America during the 1950s and 1960s has fond memories of drive-ins. Whether one remembers the playground as a child, going on dates as a teenager or taking the family out, drive-ins became an ingrained part of Americana. The drive-in became an equalizer in that a person’s social or economic status was irrelevant; people simply were going to the movies.
Many drive-ins devised elaborate and sometimes quirky modes of comfort. Some provided small propane heaters, attempting to entice their patrons to come in colder months. Others provided heating or air conditioning via underground ducts, but because the ducts so often became homes for rodents, many patrons ended up with a car full of mice instead.
Audio systems varied greatly during the era of drive-ins. Some theaters used portable speakers on trucks during the early days, but this proved ineffective, since people in the front were blasted with sound while people in the back could not adequately hear what was being said. The best solution came in the form of small speakers that could be hooked onto the sides of vehicles, though these also had quality issues and did not provide stereo sound. Many drive-ins also had strange but seemingly useful devices such as mosquito nets and rain guards.
During their height, drive-ins used attention-grabbing gimmicks to entice even more people to become patrons. Some drive-ins installed small runways so patrons could fly in. Other drive-ins had strange and unusual attractions such as a small petting zoo or a cage of monkeys. Many of the larger drive-ins had appearances by celebrities or musical groups, and some drive-ins actually had religious services performed on their grounds on Sunday morning and evening, before the show.
Eventually, the rising value of real estate made the large property areas increasingly expensive for drive-ins to operate successfully. Widespread adoption of daylight saving time subtracted an hour from outdoor evening viewing time. These changes and the advent of color televisions, VCRs and video rentals led to a sharp decline in the popularity of drive-ins. They eventually lapsed into a quasi-novelty status, with the remaining handful catering to a generally nostalgic audience, though many drive-ins continue to operate successfully in isolated areas. Many drive-in movie sites remain in use as flea markets or other businesses.
In 2002, groups of dedicated individuals began to organize so-called "guerilla drive-ins" and "guerilla walk-ins" in parking lots and empty fields. Showings are often organized online, and participants meet at specified locations to watch films projected on bridge pillars or warehouses.
The best known guerilla drive-ins include the Santa Cruz Guerilla Drive-In in Santa Cruz, California, MobMov in Berkeley, California and Hollywood MobMov in Los Angeles, California, and most recently Guerilla Drive-In Victoria in Victoria, BC.
The Bell Museum of Natural History in Minneapolis, Minnesota has recently begun summer "bike-ins," inviting only pedestrians or people on bicycles onto the grounds for both live music and movies. In various Canadian cities, al-fresco movies projected on the walls of buildings or temporarily erected screens in parks operate during the summer and cater to a pedestrian audience. (info from Wikipedia) (Kansas theater photo from www.reeldiaries.com)
Following these experiments, he applied for a patent of his invention, and he was given U.S. Patent 1,909,537 on May 16, 1933. Seventeen years later, that patent was declared invalid by the Delaware District Court.
Hollingshead's drive-in opened in New Jersey June 6, 1933 in Pennsauken, a short distance from Cooper River Park. He advertised his drive-in theater by saying, "The whole family is welcome, regardless of how noisy the children are."
It only operated for three years, but during that time the concept caught on in other states. The 1934, opening of Shankweiler's Auto Park in Orefield, Pennsylvania, was followed by Galveston's Drive-In Short Reel Theater (1934), the Pico in Los Angeles (1934) and the Weymouth Drive-In Theatre in Weymouth, Massachusetts (1936). In 1937, three more opened in Ohio, Massachusetts and Rhode Island, with another twelve during 1938 and 1939 in California, Florida, Maine, Maryland, Massachusetts, Michigan, New York, Texas and Virginia.
One reason that drive-ins were so popular is that it allowed the entire family to go to the movies and not have to hire a baby-sitter or worry that their children would disrupt the entire audience. The entire family could enjoy a movie in the privacy of their own vehicles, for the same cost that a couple would pay to see a movie in a theater.
Before World War 2, there had been approximately 100 major drive-ins nationwide, and the drive in craze began to build very strongly following the end of the war. Many GI’s had traveled the country and seen drive-ins. Enterprising businessmen realized that this segment of the population could be tapped and spend some of their earnings to enjoy themselves, a date, or an evening with the family.
The drive-in's peak popularity came in the late 1950s and early 1960s, particularly in rural areas, with some 4000 drive-ins spreading across the US. Parents with a baby could take care of their child while watching a movie, while teenagers found drive-ins ideal for dates. Revenue was more limited than regular theaters since showings can only start at twilight. There were abortive attempts to create suitable conditions for daylight viewing, such as large tent structures, but nothing viable was developed.
In the 1950s, the greater privacy afforded to patrons gave drive-ins a reputation as immoral, and they were labeled "passion pits" in the media. During the 1970s, some drive-ins changed from family fare to exploitation movies. Also, during the 1970s, some drive-ins began to show pornographic movies in less family-centered time slots to bring in extra income. This became a problem because it allowed for censored materials to be available to a wide audience, some for whom viewing was illegal. This also led to concern about the availability and uncontrollability of adult-centered media in the general public.
Teenagers with limited incomes developed an ingenious method to see drive-in movies for free: two teenagers (usually a couple) would take their car to the drive-in, and pay for two tickets. After the couple entered the theater and found a parking space, the driver would open the trunk, and other teenagers hidden inside jumped out to enjoy the "free" movies. To ensure one person was not continually stuck with paying, the ticket cost was often rotated or split among the friends.
Nearly anyone who grew up in America during the 1950s and 1960s has fond memories of drive-ins. Whether one remembers the playground as a child, going on dates as a teenager or taking the family out, drive-ins became an ingrained part of Americana. The drive-in became an equalizer in that a person’s social or economic status was irrelevant; people simply were going to the movies.
Many drive-ins devised very elaborate and sometimes quirky modes of comfort. Some drive-ins provided small propane heaters, attempting to entice their patrons to come in colder months. Some drive-ins provided a heating or air-conditioning system via underground ducts to heat or cool patrons, but due to their frequency of becoming homes for rodents, many people actually ended up with a car full of mice instead.
Audio systems varied greatly during the era of drive-ins. Some used portable speakers on trucks during the early days but this proved ineffective since the people in the front were blasted with sound while the people in the back could not adequately hear what was going being said. Finally the best solution came in the form of small speakers which could be hooked onto the sides of the vehicles. These also had issues with quality and did not provide stereo sound. Many drive-ins also had strange and seemingly useful devices such as mosquito nets and rain guards.
During their height, drive-ins used attention-grabbing gimmicks to entice even more people to become patrons. Some drive-ins installed small runways so patrons could fly in. Other drive-ins had strange and unusual attractions such as a small petting zoo or a cage of monkeys. Many of the larger drive-ins had appearances by celebrities or musical groups, and some drive-ins actually had religious services performed on their grounds on Sunday morning and evening, before the show.
Eventually, the rising value of real estate made the large property areas increasingly expensive for drive-ins to operate successfully. Widespread adoption of daylight saving time subtracted an hour from outdoor evening viewing time. These changes and the advent of color televisions, VCRs and video rentals led to a sharp decline in the popularity of drive-ins. They eventually lapsed into a quasi-novelty status with the remaining handful catering to a generally nostalgic audience, though many drive-ins continue to successfully operate in isolated areas. Many drive-in movie sites remain, as flea markets or other businesses.
In 2002, groups of dedicated individuals began to organize so-called "guerilla drive-ins" and "guerilla walk-ins" in parking lots and empty fields. Showings are often organized online, and participants meet at specified locations to watch films projected on bridge pillars or warehouses.
The best known guerilla drive-ins include the Santa Cruz Guerilla Drive-In in Santa Cruz, California, MobMov in Berkeley, California and Hollywood MobMov in Los Angeles, California, and most recently Guerilla Drive-In Victoria in Victoria, BC.
The Bell Museum of Natural History in Minneapolis, Minnesota has recently begun summer "bike-ins," inviting only pedestrians or people on bicycles onto the grounds for both live music and movies. In various Canadian cities, al-fresco movies projected on the walls of buildings or on temporarily erected screens in parks operate during the summer and cater to a pedestrian audience. (info from Wikipedia) (Kansas theater photo from www.reeldiaries.com)
Wednesday, June 6, 2007
395: death of last emperor of Roman Empire
Flavius Theodosius (347-395), also called Theodosius I and Theodosius the Great, was Roman Emperor from 379 to 395. Theodosius was the last emperor to rule both the Eastern and Western Roman Empire; after his death, the two parts split permanently. He is also known for making Christianity the official state religion of the Roman Empire.
Born in Spain, to a senior military officer, Theodosius the Elder, Theodosius accompanied his father to Britannia to help quell the Great Conspiracy in 368. He was military commander (dux) of Moesia, a Roman province on the lower Danube, in 374.
From 364 to 375, the Roman Empire was governed by two co-emperors, the brothers Valentinian I and Valens; when Valentinian died in 375, his sons, Valentinian II and Gratian, succeeded him as rulers of the Western Roman Empire.
In 378, after Valens was killed in the Battle of Adrianople, Gratian appointed Theodosius to replace the fallen emperor as co-augustus for the East. Gratian was killed in a rebellion in 383.
After the death in 392 of Valentinian II, whom Theodosius had supported against a variety of usurpations, Theodosius ruled as sole emperor, defeating the usurper Eugenius in 394, at the Battle of the Frigidus (modern Slovenia).
The Western Roman Empire was the western half of the Roman Empire after its division by Diocletian in 286. The capital of the eastern part was Nicomedia, and from 330 it was Constantinople. The capital of the western part was Mediolanum (now Milan), and from 402 it was Ravenna. The Western Empire existed intermittently in several periods between the 3rd and 5th centuries, after Diocletian's Tetrarchy and the reunifications associated with Constantine the Great and Julian the Apostate.
Theodosius I was the last Roman Emperor who ruled over a unified Roman empire. After his death in 395, the Roman Empire was permanently divided. The Western Roman Empire ended officially with the abdication of Romulus Augustus under pressure of Odoacer in 476, and unofficially with the death of Julius Nepos in 480.
The Eastern Roman Empire (known as the Byzantine Empire) survived for another millennium before being eventually conquered by the Ottoman Empire in 1453.
As the Western Roman Empire fell, a new era began in Western European history: the Middle Ages. (info from Wikipedia)
Tuesday, June 5, 2007
1885: first thermostat to control room temperature
Albert M. Butz developed a thermostatic system in Minneapolis which automatically adjusted room temperatures in residential buildings. The patents were registered in 1885 and the "Butz Thermo-Electric Regulator Company" was established. It was the first company to offer the product for the automatic regulation of internal temperatures in buildings.
In 1888, Butz left Minneapolis, and his patents were held by attorneys in the "Consolidated Temperature Controlling Company". Four years later, the name was changed to the "Electric Thermostat Company", and William R. Sweatt assumed management of the business. Ten years later, he became sole proprietor of the business, operating under the name "Electric Heat Regulator Company".
The company built its first factory in Minneapolis in 1912, and changed its name to the "Minneapolis Heat Regulator Company".
In 1924, Mark C. Honeywell, a pioneer in automation technology, developed clock-controlled thermostats in his company, "Honeywell Heating Specialties Company".
Three years later, Sweatt and Honeywell merged their companies under the name "Minneapolis-Honeywell Regulator Company." Sales reached $5.25 million in 1928, $100 million in 1945, and $1 billion in 1967.
Today the company is called "Honeywell International" and based in Morristown, New Jersey. After multiple acquisitions, expansions, reorganizations and mergers, it has a huge range of products and over 116,000 employees and annual sales over $30 billion. (info from Honeywell and Wikipedia)
Monday, June 4, 2007
2007: US cars outsell trucks for first time since 2002
More cars than light trucks were sold in the United States last month, as gasoline prices soared to more than $3 a gallon.
Of the 1.56 million vehicles sold in May, 778,651 were cars and 777,296 were light trucks, including pickups, according to Ward's AutoInfoBank, which tracks industry statistics.
The last time cars outsold light trucks was in May 2002, according to Autodata, another statistics firm. But light trucks have routinely been outselling cars each month since 1997, when consumers’ tastes for big vehicles tipped the scales in their favor.
Toyota, Nissan, General Motors and Chrysler all said their sales rose in May compared with 2006, primarily because of stronger car sales. Toyota, which has benefited most among the major companies from the rise in gas prices, said its May sales set a monthly record, up 9.7 percent from a year ago. The Ford Motor Company was the only major player to be left out: its sales fell 11.7 percent from 2006.
Among the most popular cars, sales of the Honda Civic rose 32.6 percent in May to set a new record, while sales of the Chevrolet Impala rose 44.7 percent.
Sales of the hybrid-electric Toyota Prius rose 184.9 percent, according to Autodata. Dealers in parts of the country are again reporting waiting lists for the fuel-efficient model, even though Toyota has doubled production from last year.
But this latest shift does not mean Detroit auto companies are rushing to build more cars. Ford, for example, said it would build twice as many light trucks as cars during July, August and September.
One reason, analysts said, is that trucks are far more profitable than cars, with the typical midsize S.U.V. still generating about $4,000 a vehicle in gross profits, versus about $400 for a subcompact car.
But with SUVs falling out of favor, Detroit companies are heavily promoting crossover vehicles, which are sport utilities built on the underpinnings of cars, not pickups. Ford, in fact, has said its turnaround plan rests on the success of the Edge, a crossover introduced in December. It expects to sell about 120,000 this year.
Not all trucks are selling poorly. Sales of the Chevrolet Silverado rose 10.9 percent in May compared with 2006, outselling its biggest rival, the Ford F-series, whose sales fell 15.1 percent last year.
Over all, vehicle sales were up 0.7 percent from May 2006, according to Ward’s. Car sales were up 2.2 percent, while truck sales slipped 0.7 percent.
Despite a sales decline at Ford, the company is adapting to the buyers’ changing tastes, said its sales analyst, George Pipas. Three years ago, cars accounted for only 30 percent of its sales in the United States, but now are close to half. (info from The New York Times)
Friday, June 1, 2007
1946: invention of the bikini
Although apparel resembling the bikini appears on ancient pottery, the "official" bikini was invented by French engineer Louis Réard in 1946, building on the work of fashion designer Jacques Heim, and introduced at a fashion show in Paris. It was named after Bikini Atoll in the Pacific, the site of nuclear weapons tests a few days earlier, on the theory that the excitement it caused would be as explosive as an atomic bomb.
Réard's suit was a refinement of the work of Jacques Heim, who two months earlier had introduced the "Atome" (named for its small size) and advertised it as the world's smallest bathing suit. Réard "split the atome" even smaller, but could not find a model who would dare to wear his design. He ended up hiring a nude dancer from the Casino de Paris as his model.
Bikini-style swimwear existed for many years before the first official bikini. The July 9, 1945, issue of Life magazine shows women in Paris wearing similar swimsuits. Films from Germany in the 1930s show women wearing two-piece bathing suits. The Busby Berkeley film spectacle, Footlight Parade of 1932 includes aquachoreography that featured bikini-like swimwear. They were seen again a year later in Gold Diggers of 1933.
In 1951 bikinis were banned from the Miss World Contest following the crowning of Miss Sweden in a bikini and subsequent protests with a number of countries threatening to withdraw. In 1957, however, Brigitte Bardot's bikini in And God Created Woman created a market for the swimwear in the US, and in 1960, Brian Hyland's pop song Itsy Bitsy Teenie Weenie Yellow Polka Dot Bikini inspired a bikini-buying spree.
In 1962, an icon was born as Bond Girl Ursula Andress emerged from the sea wearing a white bikini in Dr. No. Finally the bikini caught on, and by 1963, the movie Beach Party, starring Annette Funicello (emphatically not in a bikini, by mentor Walt Disney's personal request) and Frankie Avalon, led a wave of films that made the bikini a pop-culture symbol.
In recent years, the term monokini has come into use for topless bathing by women: where the bikini has two parts, the monokini is the lower part. When monokinis are in use, the word bikini may jokingly refer to a two-piece outfit consisting of a monokini and a sun hat.
The tankini is a swimsuit combining a tank top and a bikini bottom.
The lower part of the bikini was further reduced in size in the 1970s to the Brazilian thong, where the back of the suit is so thin that it disappears into the butt crack.
The sex appeal of the apparel prompted numerous film and television productions as soon as public morals changed to accept it. They include the numerous surf movies of the early 1960s and the television series, Baywatch. Iconic portrayals of bikinis in movies include the previously mentioned Ursula Andress as Bond girl Honey Ryder in Dr. No (1962), Raquel Welch as the prehistoric cavegirl in the 1966 film One Million Years B.C., and Phoebe Cates in the 1982 teen film Fast Times at Ridgemont High. These scenes were recently ranked 1, 86, and 84 in Channel 4 (UK)'s 100 Greatest Sexy Moments (in film).
In addition, a variant of the bikini popular in fantasy literature is a bikini that is made up of metal to serve as (admittedly rather impractical) armor, sometimes referred to as a "chain mail bikini" or "brass bikini"; the character Red Sonja is a famous example.
In science fiction, Star Wars Episode VI: Return of the Jedi features a metal bikini costume, worn by Princess Leia when she is held captive by Jabba the Hutt. This particular bikini has since been elevated to pop-culture icon status, spawning various spoofs and parodies, including an episode of Friends, and even a dedicated fansite, Leia's Metal Bikini. (info from Wikipedia)