With a name borrowed from an automobile tire sold in the Sears-Roebuck catalog and $700,000 in Sears funding, the Allstate Insurance Company began on April 17, 1931. On May 17, tool and die maker William Lehnertz became Allstate's first customer, paying $41.60 for a 12-month policy on his Studebaker.
At the Chicago World's Fair in 1933, Richard Roskam became the first Allstate agent when he set up a card table in the Sears exhibit and was swamped with applications. In 1934, Allstate changed its sales methods from direct-mail to Allstate-agent representation, with the opening of the first Allstate sales location in a Chicago Sears store.
In 1939, Allstate startled the insurance industry by tailoring auto rates to the driver's age, mileage, and use of the car. The plan was so popular that the rest of the industry imitated it, but it didn't take long before Allstate dominated the American auto insurance business.
The highpoint for the Allstate brand was in the 1950s and 1960s, when it appeared on a wide range of products, including garage door openers, fire extinguishers, motor scooters and camper shells. In these years, before seatbelts, heaters, radios, and air conditioners became standard equipment on automobiles, Sears offered a complete line of these accessories under the Allstate brand.
By the end of the 1960s, Sears limited the Allstate brand name to insurance, tires, and automobile batteries. By the mid-1970s, Sears no longer used the Allstate brand on merchandise. In 1995, Allstate became independent, ending Sears' 70-year relationship with the brand it created.
Today, the Allstate Corporation is the nation’s largest publicly held personal lines insurer, with $157.5 billion in assets. (info from Allstate and Sears)
PERSONAL NOTE: Back in 1972, I lived in a nice apartment in a crappy part of the Bronx. I assumed I could not possibly get insurance for the contents of the apartment. One evening, I was in a Sears store, and while waiting for my wife to finish shopping, I wandered over to the Allstate counter. I was both surprised and thrilled to find out that I could get insurance, at a very reasonable price. I stayed in Allstate's good hands for more than 30 years, in four homes. I had big claims for damage from a slow-leaking washing machine, and a fast-leaking fish tank and kitchen sink, and a burglary; and was treated very fairly. I probably got back every penny I paid in premiums.
Thursday, May 31, 2007
Wednesday, May 30, 2007
1977: Saturday Night Fever infects the world
Saturday Night Fever starred John Travolta as Tony Manero, a troubled Brooklyn youth whose weekend activities are dominated by visits to a Brooklyn discotheque. While in the disco, Tony is the king, and the disco helps him to temporarily forget the reality of his life: a dead-end job, clashes with his unsupportive and squabbling parents, racial tensions in the local community, and dead-beat friends.
The movie significantly helped to popularize disco music around the world, and made Travolta a household name. The Saturday Night Fever soundtrack, featuring disco songs by the Bee Gees, became the best selling soundtrack ever.
The film also showcased aspects of the music, the dancing, and the subculture surrounding the disco era: symphony-orchestrated melodies, haute-couture styles of clothing, sexual promiscuity, and graceful choreography.
The story is based upon a 1976 New York magazine article by British writer Nik Cohn, "Tribal Rites of the New Saturday Night." In the late-1990s, Cohn acknowledged that the article had been fabricated. A newcomer to the United States and a stranger to the disco lifestyle, Cohn was unable to make any sense of the subculture he had been assigned to write about. The characters who were to become Tony Manero and his friends sprang almost completely from his imagination.
The film is also notable as one of the first instances of cross-media marketing: the tie-in soundtrack's single was used to help promote the film before its release, and the film popularized the entire soundtrack after its release.
The story has Tony Manero connect with the aloof Stephanie (Karen Lynn Gorney) one night at the disco. Despite her initial frosty and superior attitude toward Tony, she agrees, after much urging, to partner with him in the dance contest. Tony had previously agreed to dance with Annette, who had actively pursued him despite his obvious disdain for her. Stephanie has a job in Manhattan as a secretary for a magazine, is poised to move there, and has more opportunities to work her way up. This awakens in Tony the need to transcend his working-class roots in Bay Ridge, Brooklyn. Ultimately, however, Stephanie reveals her own vulnerabilities.
The film also examines Tony's relationship with his family, including an older brother (clearly his parents' favorite child) who abandons a planned career in the priesthood.
The unsentimental depiction of the subculture of the main characters contrasts with Travolta's follow-up film, the sanitized Grease (1978).
There were two theatrically released versions of the film: the "original" R-rated version and the PG-rated "edited version." The PG version was released in 1978 in an attempt to attract a more youthful audience. It is shorter than the original, with profanity replaced by separately filmed scenes that substituted milder language, and with several scenes shortened or cut. Both theatrical versions were released on VHS, but only the R-rated version was released on LaserDisc and later on DVD, and the DVD version is shown in widescreen only. The R-rated version contains profanity, nudity, drug use, and a date-rape scene that was de-emphasized or removed entirely in the PG version.
A sequel, Staying Alive, was released in 1983. It starred John Travolta and was directed by Sylvester Stallone.
Saturday Night Fever was the favorite movie of the late film critic Gene Siskel, who claimed to have seen it 17 times. He liked the movie so much that he bought the famous white disco suit worn by Travolta in the film at a charity auction for $17,000.
John Travolta still has the pair of high-heeled shoes he wore during the opening and dance sequences of the film (as depicted in the poster). He says he sometimes takes them out of the closet, but claims he doesn't wear them. (info from Wikipedia)
Tuesday, May 29, 2007
1889: first jukebox
Coin-operated music boxes and player pianos provided automatic pay-per-tune music in fairgrounds, amusement parks and other public places in the 1800s, decades before the introduction of reliable coin-operated phonographs.
Some of these automatic musical instruments were extremely well built and have survived until now. In the long run, they could not compete commercially with the jukebox, since they were limited to the instrument (or instruments) used in their construction, and could not reproduce the human voice.
One of the early forerunners of the modern jukebox was the Nickel-in-the-Slot machine. In 1889, Louis Glass and William S. Arnold placed a coin-operated Edison cylinder phonograph in the Palais Royale Saloon in San Francisco.
It was fitted with a coin mechanism patented by Glass and Arnold, and collected a nickel for each play. The machine had no amplification and patrons had to listen to the music using one of four listening tubes. In its first six months of service, the Nickel-in-the-Slot earned over $1000.
During the 1890s, recordings became popular primarily through coin-in-the-slot phonographs in public places.
Initially, manufacturers did not call them "jukeboxes." They called them Automatic Coin-Operated Phonographs (or Automatic Phonographs, or Coin-Operated Phonographs). The term "jukebox" appeared in the 1930s and originated in the southern US. It either derived from the African-American slang word "jook," meaning "dance," or was coined by critics who said the machines would encourage criminal behavior, a reference to the fictitious "Juke" family name. (info from About.com and Wikipedia)
Friday, May 25, 2007
1627: first snowman in North America
Plymouth Colony was an English colonial venture in the southeastern portion of the modern state of Massachusetts from 1620 until 1691.
Founded by separatists who came to be known as the Pilgrims, Plymouth Colony was one of the earliest colonies to be founded by the English in North America and the first sizable permanent English settlement in New England.
The colonists are known for their landing at Plymouth Rock, a treaty with Native Americans, and the first celebration of Thanksgiving. Recent research at Harvard University revealed that the colony was the site of the first snowman made in North America by European settlers.
Henrik Bjorn Oldenburg had built snowmen as a child in Norway before his family moved to England in 1615. He joined the crew of the Mayflower for the 1620 voyage to Plymouth, and on February 4, 1627, he built a snowman in front of a cabin used as a schoolhouse in Plymouth for the amusement of the children. His great-great-great-great-great-great-great-great-great-great-grandson Peter Oldenburg built a snowman in front of the Harvard Library in 2007 to mark the 380th anniversary of the first snowman.
Thursday, May 24, 2007
1981: first woman president
in Eastern Hemisphere
Soong Ch'ing-ling (1893 – 1981), also known as Madame Sun Yat-sen, was one of three sisters whose husbands were among China's most significant political figures of the early 20th century.
She was the daughter of a wealthy businessman and missionary, and graduated from Wesleyan College in the US. Ching-ling married Sun Yat-sen, who had previously been married. Her parents greatly opposed the match, as Dr. Sun was 26 years her senior. After Sun's death in 1925, she was elected to the Kuomintang Central Executive Committee in 1926.
In 1939, she founded the China Defense League, which later became the China Welfare Institute. The committee worked for peace and justice, and now focuses on maternal and pediatric healthcare, preschool education, and other children's issues.
In the early 1950s, she founded the magazine China Reconstructs, with the help of Israel Epstein. This magazine is published monthly in six languages.
After the establishment of the People's Republic of China, she became the Vice President ("Vice Chair"), Head of the Sino-Soviet Friendship Association, and Honorary President of the All-China Women's Federation. In 1951 she was awarded the Stalin Peace Prize (called the "Lenin Peace Prize" after de-Stalinization), and in 1953 a collection of her writings was published. From 1968 to 1972 she acted jointly with Dong Biwu as head of state.
On May 16, 1981, two weeks before her death, she was admitted to the Communist Party and was named Honorary President of the People's Republic of China. She is the only person ever to hold this title. (info from Wikipedia and terra.es)
Wednesday, May 23, 2007
1974: first woman president
in Western Hemisphere
María Estela Martínez Cartas was born in Argentina in 1931. She became a nightclub dancer in the early 1950s, adopting a variant of her saint's name, Isabel, as her stage name. She met her future husband, Juan Domingo Perón, during his exile in Panama. Perón, who was 35 years her senior, was attracted to her beauty and believed she could provide him with the female companionship he had been lacking since the death of his second wife, Eva ("Evita"). Isabel soon gave up her career in show business and became Perón's personal secretary.
Perón brought Isabel with him when he moved to Spain, in 1960. Authorities in the Roman Catholic nation did not approve of Perón's living arrangements with this young woman, so in 1961 the former president reluctantly got married for a third time.
As Perón began to return to an active role in Argentine politics, Isabel would often be used as a go-between from Spain to South America. Having been deposed in a coup years prior, Perón was forbidden from returning to Argentina, so his new wife would travel for him and report back to him.
It was also around this time that Isabel met José López Rega, an occult "philosopher" and fortune teller, and founder of the Argentine Anticommunist Alliance, a death squad that later killed some 1,500 people during Isabel's presidency. Isabel was quite interested in the occult, and the two quickly became friends. Under pressure from Isabel, Perón appointed López as his personal secretary.
Héctor Cámpora was nominated by Perón's Justicialist Party to run in the 1973 presidential elections, and won. However, it was generally understood that Perón held the real power. Later that year, Perón was persuaded to return to Argentina. Cámpora resigned to allow Perón to run for president.
In a surprisingly uncontroversial move, he chose Isabel as his running mate. Perón's return from exile was marked by a growing rift between the right and left wings of the Peronist movement: Cámpora represented the left wing, while López Rega represented the right. Under López Rega's influence, Juan and Isabel Perón favored the right wing. Isabel had little political experience or ambition, and she was a very different personality from Evita, who had been far more political and who had been denied the post of vice president years earlier.
Juan Perón died on July 1, 1974, less than a year after his third election to the presidency. Isabel assumed the presidency, becoming the first non-royal female head of state and head of government in the Western Hemisphere.
Unlike Evita, who was almost a demigoddess in Argentina, Isabel was very unpopular. One factor was that López Rega, by this time minister of social welfare, had so much influence over Isabel that he was a de facto prime minister. Despite his right-wing views, his status as the power behind the throne greatly frightened the military. Rodolfo Almirón, arrested in 2006 in Spain, was in charge of López Rega's and Isabel Perón's personal security.
Isabel agreed to fire López, but the military concluded that with the prevailing climate of widespread strikes and political terrorism, a "weak-willed and inexperienced woman" would not be a suitable president. A sharp rise in inflation during her time in power did not help her case.
On March 24, 1976, she was deposed in a bloodless coup. After remaining under house arrest for five years, she was sent into exile in Spain in 1981, and returned briefly to Argentina in 1984, shortly after democracy was restored. (info from Wikipedia and terra.es)
Tuesday, May 22, 2007
1976: Caller ID
In 1976, Japanese inventor Kazuo Hashimoto built a prototype of a caller ID display device.
In 1982, a patent for Caller ID was filed by Carolyn Doughty of Bell Laboratories, then part of AT&T (the old AT&T, not the new one).
Initially, the operating telephone companies wanted to have the caller ID function performed by the central office as a voice announcement, charged on a per-call basis. John Harris, an employee of Northern Telecom's telephone set manufacturing division in Ontario, promoted the idea of putting caller ID on a telephone set display.
The first market trial for caller ID and other "TouchStar" services was on July 7, 1984 in Orlando, Florida. Ellis D. Hill, the head of the BellSouth Product team, coined the term "caller ID."
In 1987, Bell Atlantic conducted another market trial in Hudson County, New Jersey, which was followed by limited deployment. BellSouth began the first commercial application of caller ID in December 1988 in Memphis, Tennessee, and was the first regional Bell to fully deploy the system. Later enhancements included delivering the caller's name along with the number, caller ID on call waiting, and talking caller ID.
Originally, caller ID was provided from an accessory box set up near a phone. Today, the feature is built into many phones. Some phone companies charge extra for the service, but it is usually free with cellular and broadband phone service. (some info from About.com and Wikipedia)
Monday, May 21, 2007
2005: Swiss Army knife makers stop fighting
A Swiss Army knife is a multi-function pocket tool with a knife blade and other tools, such as a screwdriver, tweezers, and a can opener. It originated in Switzerland in 1897, but the term is not a government-protected designation of origin, and any knife can be marketed as a Swiss Army knife. The term "Swiss Army knife" was coined by US soldiers after World War II, who couldn't pronounce its original name, "Offiziersmesser."
The most common tools featured are, in addition to the main blade, a smaller second blade, tweezers, toothpick, corkscrew, can opener, bottle opener, Phillips-head screwdriver, nail file, scissors, saw, file, hook, magnifying glass, ballpoint pen, fish scaler, pliers and key chain. Recent technological additions include USB flash drives, a digital clock, a digital altimeter, an LED light, a laser pointer, and an MP3 player. The official army model also contains a brass spacer, which allows the knife, with the screwdriver and the reamer extended at the same time, to be used to assemble the Swiss Army assault rifles.
In 1891, Karl Elsener, then owner of a company that made surgical equipment, discovered to his dismay that the pocket knives supplied to the Swiss Army were made in Germany. Upset, he founded the Association of Swiss Master Cutlers to produce Swiss knives for the Swiss Army.
The original had a wooden handle and featured a blade, a screwdriver, a can opener, and a punch. This knife was sold to the Swiss army, but Elsener was not satisfied. In 1896, he managed to put blades on both sides of the handle using a special spring mechanism. This allowed Elsener to put twice as many features on the knife, and he added a second blade and a corkscrew.
Elsener, through his company Victorinox, managed to corner the market until 1893, when a second industrial cutlery firm, Paul Boechat & Cie, started selling a similar product. That company was later acquired by its general manager, Theodore Wenger, and renamed the Wenger Company.
In 1908 the Swiss government, wanting to avoid any appearance of regional favoritism, and perhaps hoping a bit of competition would lower prices, split the contract, with Victorinox and Wenger each getting half of the orders placed. By mutual agreement, Wenger advertises its product as the Genuine Swiss Army Knife, while Victorinox uses the slogan the Original Swiss Army Knife.
In 2005, Victorinox acquired Wenger, thus once again becoming the sole supplier of knives to the Swiss Army. Victorinox has stated that it intends to keep using both brands for consumer products.
In 2006, Victorinox produced a knife with 85 devices and 110 functions to commemorate Wenger's 100th anniversary in the Swiss Army knife business. The Giant is a novelty collector's item that is nearly nine inches wide and retails for about $1,200.
The Swiss Army Knife was an important part of the American TV show MacGyver (1985 - 1992), where MacGyver often improvised equipment needed to solve problems. He often used his knife to help build mechanisms out of common items, which led to sayings such as "making a rocket out of a matchbox and a paper clip." (info from Wikipedia)
Friday, May 18, 2007
1997: first camera phone
Ten years ago, Philippe Kahn was in a hospital.
"We were going to have a baby and I wanted to share the pictures," Kahn said, "and there was no easy way to do it."
So as he sat in a maternity ward, he wrote a crude program on his laptop and sent an assistant to a RadioShack store to get a soldering iron and parts to connect his digital camera to his cellphone. When Sophie was born, he sent her photo over a cellular connection to acquaintances around the globe.
"It's had a massive impact because it's just so convenient," said Kahn, a tech industry maverick who founded software maker Borland, an early Microsoft rival.
"There's always a way to capture memories and share it," he said. "You go to a restaurant, and there's a birthday and suddenly everyone is getting their camera phones out. It's amazing."
A decade later, 41 percent of American households own a camera phone. Market researcher Gartner Inc. predicts that about 589 million cell phones will be sold with cameras in 2007, increasing to more than 1 billion worldwide by 2010.
The contraption Kahn assembled in 1997 has evolved into a pocket-friendly phenomenon that has empowered both citizen journalists and personal paparazzi.
It has prompted lawsuits: a student sued campus police at UCLA for alleged excessive force after officers were caught on cellphone video using a stun gun during his arrest. It has also been a catalyst for change: a government inquiry into police practices ensued in Malaysia after a cellphone video revealed a woman detainee being forced to do squats while naked.
On another scale, parents use cellphone slideshows, not wallet photos, to show off pictures of their children, while adolescents document their rites of passage with cellphone cameras and instantly share the images.
Kahn's makeshift photo-communications system formed the basis for a new company, LightSurf Technologies, which he later sold to VeriSign. LightSurf built "PictureMail" software and worked with cellphone makers to integrate the wireless photo technology.
Sharp was the first company to sell a commercial cellphone with a camera in Japan in 2000. Camera phones didn't debut in the US until 2002, Kahn said.
Though Kahn's work revolved around transmitting only still photographs, his groundbreaking implementation of instant sharing via cellphone planted a seed.
Kahn is well aware of how the camera phone has since been put to negative uses: sneaky shots up women's skirts, or the violent trend of "happy slapping" in Europe where youths provoke a fight or assault, capture the incident on camera and then spread the images on the Web or between mobile phones.
But he likes to focus on the technology's benefits. It's been a handy tool that has led to vindication for victims or validation for vigilantes.
As Kahn heard the smattering of stories in recent years about assailants scared off by a camera phone or criminals who were nabbed later because their faces or their license plates were captured on the gadget, he said, "I started feeling it was better than carrying a gun."
And though he found the camera-phone video of the former Iraqi dictator's execution disturbing, Kahn said the gadget helped "get the truth out." The unofficial footage surreptitiously taken by a guard was vastly different from the government-issued version and revealed a chaotic scene with angry exchanges depicting the ongoing problems between the nation's factions. (info from The Wall Street Journal)
"We were going to have a baby and I wanted to share the pictures," Kahn said, "and there was no easy way to do it."
So as he sat in a maternity ward, he wrote a crude program on his laptop and sent an assistant to a RadioShack store to get a soldering iron and parts to connect his digital camera to his cellphone. When Sophie was born, he sent her photo over a cellular connection to acquaintances around the globe.
"It's had a massive impact because it's just so convenient," said Kahn, a tech industry maverick who founded software maker Borland, an early Microsoft rival.
"There's always a way to capture memories and share it," he said. "You go to a restaurant, and there's a birthday and suddenly everyone is getting their camera phones out. It's amazing."
A decade later, 41 percent of American households own a camera phone. Market researcher Gartner Inc. predicts that about 589 million cell phones will be sold with cameras in 2007, increasing to more than 1 billion worldwide by 2010.
The contraption Kahn assembled in 1997 has evolved into a pocket-friendly phenomenon that has empowered both citizen journalists and personal paparazzi.
It has prompted lawsuits - a student sued campus police at UCLA for alleged excessive force after officers were caught on cellphone video using a stun gun during his arrest; and been a catalyst for change - a government inquiry into police practices ensued in Malaysia after a cellphone video revealed a woman detainee being forced to do squats while naked.
On another scale, parents use cellphone slideshows - not wallet photos - to show off pictures of their children, while adolescents document their rites of passage with cell phone cameras and instantly share the images.
Kahn's makeshift photo-communications system formed the basis for a new company, LightSurf Technologies, which he later sold to VeriSign. LightSurf built "PictureMail" software and worked with cellphone makers to integrate the wireless photo technology.
Sharp was the first company to sell a commercial cellphone with a camera in Japan in 2000. Camera phones didn't debut in the US until 2002, Kahn said.
Though Kahn's work revolved around transmitting only still photographs, his groundbreaking implementation of the instant-sharing via a cellphone planted a seed.
Kahn is well aware of how the camera phone has since been put to negative uses: sneaky shots up women's skirts, or the violent trend of "happy slapping" in Europe where youths provoke a fight or assault, capture the incident on camera and then spread the images on the Web or between mobile phones.
But he likes to focus on the technology's benefits. It's been a handy tool that has led to vindication for victims or validation for vigilantes.
As Kahn heard the smattering of stories in recent years about assailants scared off by a camera phone or criminals who were nabbed later because their faces or their license plates were captured on the gadget, he said, "I started feeling it was better than carrying a gun."
And though he found the camera-phone video of the former Iraqi dictator's execution disturbing, Kahn said the gadget helped "get the truth out." The unofficial footage surreptitiously taken by a guard was vastly different from the government-issued version and revealed a chaotic scene with angry exchanges depicting the ongoing problems between the nation's factions. (info from The Wall Street Journal)
Thursday, May 17, 2007
1858: first use of fingerprints for identification
In 1858, Sir William Herschel, Chief Magistrate of the Hooghly district in Jungipoor, India, first used fingerprints on contracts. On a whim, and with no thought toward personal identification, Herschel had Rajyadhar Konai, a local businessman, impress his hand print on the back of a contract.
The idea was merely "...to frighten [him] out of all thought of repudiating his signature." Konai was suitably impressed, and Herschel made a habit of requiring palm prints -- and later, simply prints of the right index and middle fingers -- on every contract. The locals believed that personal contact with the document made the contract more binding than a signature alone. The first wide-scale, modern-day use of fingerprints was predicated not upon scientific evidence, but upon superstition.
As his fingerprint collection grew, Herschel began to note that the inked impressions could prove or disprove identity. His conviction that all fingerprints were unique to the individual, as well as permanent throughout that individual's life, inspired him to expand their use.
During the 1870s, Dr. Henry Faulds, the British Surgeon-Superintendent of Tsukiji Hospital in Japan, took up the study of "skin-furrows" after noticing finger marks on specimens of ancient pottery. In 1880, Faulds forwarded an explanation of his classification system and a sample of the forms he had designed for recording inked impressions to Charles Darwin. Darwin, in advanced age and ill health, informed Faulds that he could be of no assistance, but promised to pass the materials on to his cousin, Francis Galton.
Also in 1880, Faulds published an article discussing fingerprints as a means of personal identification and the use of printers' ink as a method for obtaining them. He is also credited with the first fingerprint identification, made from a greasy print left on an alcohol bottle.
In 1882, Gilbert Thompson of the US Geological Survey in New Mexico used his own fingerprints on a document to prevent forgery. This is the first known use of fingerprints in the United States.
In Mark Twain's book Life on the Mississippi, a murderer is identified by his fingerprints. Twain's later book Pudd'nhead Wilson features a dramatic court trial that turns on fingerprint identification.
Sir Francis Galton, a British anthropologist and a cousin of Charles Darwin, began his observations of fingerprints as a means of identification in the 1880's. In 1892, he published his book, Fingerprints, establishing the individuality and permanence of fingerprints. The book included the first classification system for fingerprints.
Galton's primary interest in fingerprints was as an aid in determining heredity and racial background. While he discovered that fingerprints offered no firm clues to an individual's intelligence or genetic history, he was able to scientifically prove what Herschel and Faulds suspected: that fingerprints do not change over the course of an individual's lifetime, and that no two fingerprints are exactly the same. According to his calculations, the odds of two fingerprints being the same were 1 in 64 billion. Galton identified the characteristics by which fingerprints can be identified. These characteristics are in use today, and are often referred to as Galton's Details.
In 1891, Juan Vucetich, an Argentine police official, began the first fingerprint files based on Galton's pattern types. In 1892, Vucetich made the first criminal fingerprint identification. He was able to identify a woman who had murdered her two sons and cut her own throat in an attempt to place blame on someone else. Her bloody print, left on a door post, proved her identity as the murderer.
In 1924, Congress established the Identification Division of the FBI. By 1946, the FBI had processed 100 million fingerprint cards in manually maintained files; and by 1971, 200 million cards. With the introduction of AFIS (Automated Fingerprint Identification System), the files were split into computerized criminal files and manually maintained civil files. Many of the manual files were duplicates, so the records actually represented about 25 million criminals, and an unknown number of people in the civil files. (info from AlladinUSA.com)
Wednesday, May 16, 2007
1911: first rear-view mirror in a car
The earliest known mention of a rear-view mirror is in the 1906 book The Woman and the Car, which advised women drivers to "carry a little hand-mirror in a convenient place when driving" so they could "hold the mirror aloft from time to time in order to see behind while driving in traffic," thus describing the rear-view mirror before manufacturers introduced it in 1914.
The earliest known rear-view mirror mounted on a motor vehicle was in Ray Harroun's Marmon racecar at the inaugural Indianapolis 500 race in 1911.
According to Al Binder of Ward's Auto World: As per the custom of the day, all cars except Harroun's carried riding mechanics who, among other things, helped the driver keep track of other vehicles during the race. Unable to find a mechanic to ride with him, Harroun installed a mirror on his car so he could view what was happening behind him and be alert to any cars overtaking him.
Although Harroun's use is the first known use of such a mirror on a motor vehicle, Harroun himself claimed he got the idea from seeing a mirror used for the same purpose on a horse-drawn vehicle in 1904. The invention seems to have worked — Harroun won the race, netting a $14,250 prize, equivalent to about $280,000 today.
However, the rear-view mirror had to wait for Elmer Berger, the man usually credited with inventing it, to develop it for street use. (info from Wikipedia)
Tuesday, May 15, 2007
2007: Cynical Cousin Dave calls someone an idiot for the 10,000th time
1974: last Flip Wilson Show
Flip Wilson was among a group of rising black comics of the early 1970s that included Bill Cosby, Nipsey Russell and Dick Gregory. He is best remembered as the host of The Flip Wilson Show, the first variety show bearing the name of its African-American host, and for his role in renewing stereotype comedy.
With a keen wit developed during his impoverished youth, Wilson rose quickly to fame as a stand-up comic and television show host. Under the stage name Flip, inherited from Air Force pals who joked he was "flipped out," Wilson began performing in cheap clubs across the United States. His early routines featured black stereotypes of the controversial Amos 'n' Andy type. After performing in landmark black clubs such as the Apollo in Harlem and the Regal in Chicago, Wilson made a successful appearance on the Ed Sullivan Show. Recommended by Redd Foxx, Wilson also performed on The Tonight Show to great acclaim, eventually becoming a substitute host.
After making television guest appearances on such shows as Love, American Style and That's Life, and starring in his own 1969 NBC special, Wilson was offered an hour-long prime-time NBC show, The Flip Wilson Show, which enjoyed a remarkable four-year run. Only Sammy Davis Jr. had enjoyed similar success with his song-and-dance variety show; by comparison, shows hosted by Nat "King" Cole and Bill Cosby were quickly canceled due to lack of sponsorship and narrow appeal. At the show's high point, advertising rates swelled to $86,000 per minute, and by 1972, The Flip Wilson Show was rated the most popular variety show, and the second most popular show overall, in the United States.
Wilson's television success came from his unique combination of "new" stereotype comedy and his signature stand-up form. His style combined deadpan delivery and dialect borrowed from his role models, Redd Foxx and Bill Cosby, but replaced their humorous puns with storytelling. His fluid body language, likened to that of silent screen actor Charlie Chaplin, gave Wilson's act a dynamic and graceful air. The show benefited from his intensive production efforts, unprecedented for a black television performer: he wrote one third of the show's material, heavily edited the work of writers, and demanded a five-day workweek from his staff and guests to produce each one-hour segment. Audiences appreciated the show's innovative stylistic risks, such as the intimate theater-in-the-round studio and the medium-long shots that replaced close-ups to fully capture Wilson's expressive movements.
Wilson altered his club act for television to accommodate family viewing, relying on descriptive portraits of black characters and situations rather than ridicule. Still, his show offended many African-Americans and civil rights activists who believed Wilson's humor depended on race. A large black and white television audience, however, found universal humor in the routines, and others credited Wilson with subtly ridiculing the art of stereotyping itself. Wilson himself denied this claim, strongly denouncing suggestions that his race required his art to carry anti-bias messages.
These divergent interpretations in fact reflect the variety and difference among Wilson's characters. Some were easily offensive, such as the money-laundering Reverend Leroy and the smooth swinger, Freddy the Playboy. Others, such as Sonny, the White House janitor and "wisest man in Washington," were positive black portraits. The show's most popular character, Geraldine, exemplified Wilson's intention to produce race-free comedy. Perfectly coiffed and decked out in designer clothes and chartreuse stockings, Geraldine demanded respect and, in Wilson's words, "Everybody knows she don't take no stuff." Liberated yet married, outspoken yet feminine, ghetto-born yet poised, Geraldine was neither floozy nor threat. This colorful black female image struck a positive chord with viewers; her one-liners -- "The devil made me do it" and "When you're hot, you're hot" -- became national fads.
Social messages were imparted indirectly through Wilson's characters; Geraldine, for example, countered the female-degrading acts of other popular stand-up comics. Through Geraldine, Wilson also negotiated race and class bias by positively characterizing a working class black female, in contrast to the absence of female black images on 1970s television, with the exception of the middle-class black nurse of the 1969 sitcom Julia.
Wilson's humor was at the same time insightful, self-effacing and often intellectual. One of his best jokes (the retelling of a very old joke) was relayed as follows: "Lots of crazy things happen in traveling. Just last week I was on a train. There was a woman traveling with a baby. UGLY baby! I mean, I'm not one to make comments about anyone's kid -- but this was an UGLY baby. A guy walks down the train -- he's half smashed -- and he stops. And he stares. And the lady says "What are you looking at?" The guy says "I'm looking at that ugly baby." A scene ensues, whereupon the conductor arrives. He says "What's going on here?" The woman says "This man just insulted me!" The conductor says "Now calm down Madam, calm down. We here at the railroad want to make sure that there are no altercations between our passengers and that everyone's trip is as relaxing as possible. Accordingly, if you allow us, please step into the dining car and the railroad will buy you a free meal. And maybe we can find a banana for your monkey."
In his stand-up comedy routine "Columbus," Wilson re-tells the story of Christopher Columbus from a slightly 'urban' perspective, with Columbus finally convincing the Spanish monarchs to fund his voyage by noting that discovering America means that he can thus also discover Ray Charles. Hearing this, Queen Isabella, sounding not unlike "Geraldine," says that "Chris" can have "all the money you want, Honey -- You go find Ray Charles!!" When Columbus departs from the dock, Isabella is there, testifying to one and all that "Chris gonna find Ray Charles!!"
Wilson's career lost momentum when his show was canceled in 1974. Though he had received a 1970 Emmy Award for outstanding writing and a 1971 Grammy for best comedy record, his career never rekindled. He continued to make television specials and TV guest appearances, debuted in Sidney Poitier's successful post-blaxploitation film Uptown Saturday Night, and appeared in two subsequent unsuccessful films. His 1985 television comeback, Charlie and Company -- a sitcom following The Cosby Show's formula -- had a short run.
Wilson saw himself first as an artist; hence, humor was more prominent than politics in his comic routines. This style, however, allowed him to successfully impart occasional social messages into his act. Moreover, he achieved unprecedented artistic control of his show, pushing the boundaries for black television performers and producers. Through Geraldine, Wilson created one of the few respectful television images of black women, who were generally marginalized by both the civil rights and women's movements of that era. Finally, though no regular black variety show took up where Wilson left off, the show's success paved the way for the popularity of later sitcoms featuring middle- and working-class black families, situations, and dialect -- shows such as Sanford and Son, The Jeffersons and Good Times. Wilson died of cancer at the age of 64, in 1998. (info from The Museum of Broadcast Communications and Wikipedia)
Monday, May 14, 2007
1998: Daimler buys Chrysler
2007: Daimler sells Chrysler
DaimlerChrysler announced today a deal to sell a controlling stake in Chrysler Group, Detroit's No. 3 auto maker, to private-equity firm Cerberus Capital Management. Daimler paid $36 billion to buy Chrysler nine years ago, and will sell about 80% of it for $7.4 billion.
The deal, in which DaimlerChrysler is effectively paying to dispose of most of Chrysler, is also a watershed in the industry, marking the first time a private-equity company has acquired one of the world's biggest auto makers.
In Greek mythology, Cerberus was a monstrous three-headed dog with a snake for a tail. Cerberus guarded the gate to hell and ensured that spirits of the dead could not leave.
DaimlerChrysler said that an affiliate of Cerberus (the company, not the monster) will acquire an 80.1% stake in the new Chrysler Holding LLC, while DaimlerChrysler will keep a 19.9% stake. It said that obligations for pensions and health-care costs would be retained by the Chrysler companies.
The affiliate will make a capital contribution of $7.4 billion in return for the 80.1% equity interest in the future new company. Due to the new corporate structure, the name of DaimlerChrysler AG is to be changed to Daimler AG.
DaimlerChrysler said the deal is still subject to the approval of its supervisory board, but has been backed by the United Auto Workers union. The closing of the transaction is expected to take place in the third quarter of 2007.
The news sent DaimlerChrysler shares up 5.4% in morning trade in Frankfurt. Shares extended a nearly 30% rise going into Monday on anticipation of such a deal.
The takeover comes as the industry struggles under the weight of massive pension and health-care obligations to its union workers. Those debts and the cash required to fund them have hobbled General Motors Corp., Ford Motor Co. and Chrysler in the face of relentless competition from Asian and European rivals. Chrysler has estimated that Japanese auto makers like Toyota Motor Corp. enjoy a labor-cost advantage of as much as $30 an hour.
Cerberus has a record of slashing costs at operations it acquires, and some analysts say a Cerberus-owned Chrysler could move much more aggressively to cut labor costs, prune Chrysler's crowded dealer network in the U.S. and shift investment to developing markets overseas. But any final deal for Chrysler also hinges on what happens this summer, when the United Auto Workers kicks off negotiations for new contracts with all three Detroit auto makers.
The proposed sale would mark the end of DaimlerChrysler's turbulent nine-year effort to make a success of an ambitious global expansion strategy pushed by former Chairman and Chief Executive Officer Jürgen Schrempp. Mr. Schrempp rocked the auto industry in 1998 with his deal to buy Chrysler, which at the time was at a high point in its profit and product cycle.
Billed at the time as a "merger of equals," the deal never delivered the synergies that Mr. Schrempp and his successor, Dieter Zetsche, argued it could. Instead, Chrysler's performance seesawed from losses to profits to, most recently, losses again. For 2006, Chrysler Group had a loss of $633.3 million on revenue of $63.6 billion. (info from The Wall Street Journal and Wikipedia)
Friday, May 11, 2007
1969: first computer with a floppy disc drive
2009?: last computer with a floppy disc drive
In 1967, IBM wanted to develop a simple and inexpensive system for loading microcode into its System/370 mainframe computers. The 370 was the first IBM computer to use semiconductor memory, and whenever the power was turned off the microcode had to be reloaded. Normally this task would be done with the tape drives that almost all 370 systems included, but tapes were large and slow. IBM wanted something faster and lighter that could also be sent out to customers with software updates for $5.
An IBM team developed a read-only, 8-inch diameter flexible "floppy" disk they called the "memory disk", holding 80 kilobytes. The new device became a standard part of 370 systems starting in 1969.
In 1973 IBM released a new version of the floppy that stored up to 250 KB on the same disks, and was read-write. These drives became common, and soon were being used to move smaller amounts of data around, almost completely replacing magnetic tapes. The IBM standard soft-sectored disk format was designed to hold just as much data as one box of punch cards.
When the first microcomputers were being developed in the 1970s, the 8-inch floppy found a place on them as one of the few "high speed, mass storage" devices that were even remotely affordable to the target market (individuals and small businesses). The first microcomputer operating system, CP/M, originally shipped on 8-inch disks. However, the drives were still expensive, in the early days typically costing more than the computer they were attached to, so most machines of the era used cassette tape instead.
This began to change with the acceptance of the first floppy-disk standard, which brought competitors together to make media to a single interchangeable specification and allowed rapid improvements in quality and cost.
In 1976, two Shugart Associates employees were approached by An Wang of Wang Laboratories, who felt that the 8-inch format was simply too large for the desktop word-processing machines he was developing at the time. After meeting in a bar in Boston, one of them, Adkisson, asked Wang what size he thought the disks should be; Wang pointed to a napkin and said "about that size." Adkisson took the napkin back to California, found it to be 5¼ inches (13 cm) wide, and developed a new drive of this size storing 98.5 KB, later increased to 110 KB by adding five tracks. This is believed to be the first standard computer medium that was not promulgated by IBM.
The 5¼-inch drive was considerably less expensive than 8-inch drives from IBM, and soon started appearing on CP/M machines. At one point Shugart was producing 4,000 drives a day. By 1978 there were more than 10 manufacturers producing 5¼-inch floppy drives, in competing physical disk formats: hard-sectored (90 KB) and soft-sectored (110 KB). The 5¼-inch formats quickly displaced the 8-inch from most applications.
These early drives read only one side of the disk, leading to the popular budget approach of cutting a second write-enable slot and index hole into the carrier envelope and flipping it over (thus, the "flippy disk") to use the other side for additional storage. Some considered this risky, reasoning that a flipped disk would spin in the opposite direction inside its cover, so dirt collected by the fabric lining in previous rotations would be picked up by the disk and dragged past the read/write head. In reality, since some single-head floppy drives had their read/write heads on the bottom and some had them on the top, disk manufacturers routinely certified both sides of their disks, so the method was perfectly safe.
Write-protect tabs -- sticky paper tabs folded over the notch in the side of a 5¼-inch disk -- prevented the computer from writing data to the disk; later disks, such as the 3½-inch disk, had a built-in slidable plastic tab for write protection. Tandon introduced a double-sided drive in 1978, doubling the capacity, and a new "double density" format increased it again, to 360 KB.
For most of the 1970s and 1980s the floppy drive was the primary storage device for microcomputers. Since these micros had no hard drive, the OS was usually booted from one floppy disk, which was then removed and replaced by another one containing the application. Some machines using two disk drives (or one dual drive) allowed the user to leave the OS disk in place and simply change the application disks as needed. In the early 1980s, "quad density" 96 track-per-inch drives appeared, increasing the capacity to 720 KB.
Despite the available capacity of the disks, support in the most popular operating systems of the early '80s -- PC-DOS and MS-DOS -- lagged slightly behind. In fact, the original IBM PC did not include a floppy drive at all as standard equipment; you could either buy the optional 5¼-inch floppy drive or rely on the cassette port. With version 1.0 of DOS (1981), only single-sided 160 KB floppies were supported. Version 1.1 the next year expanded support to double-sided 320 KB disks. Finally, in 1983, DOS 2.0 supported 9 sectors per track rather than 8, providing 180 KB on a formatted single-sided disk and 360 KB on a double-sided one. Along with this change came support for directories on the disk (now commonly called folders), which came in handy for organizing the greater number of files possible in the increased space.
In 1984, along with the IBM PC/AT, the high-density disk appeared, which combined 96 tracks per inch with higher-density magnetic media to provide 1,200 KB of storage. Since the usual (very expensive) hard disk held 10–20 megabytes at the time, this was considered quite spacious.
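The capacities quoted above fall out of simple multiplication of sides, tracks, sectors and sector size. As a rough illustration only -- the 40- and 80-track-per-side geometries and 512-byte sectors below are the usual PC values, assumed here rather than taken from this article -- a few lines of Python reproduce the numbers:

def formatted_capacity_kb(sides, tracks_per_side, sectors_per_track, bytes_per_sector=512):
    # formatted capacity in KB, where 1 KB = 1,024 bytes
    return sides * tracks_per_side * sectors_per_track * bytes_per_sector // 1024

print(formatted_capacity_kb(1, 40, 8))   # 160  -> DOS 1.0, single-sided
print(formatted_capacity_kb(2, 40, 8))   # 320  -> DOS 1.1, double-sided
print(formatted_capacity_kb(1, 40, 9))   # 180  -> DOS 2.0, single-sided
print(formatted_capacity_kb(2, 40, 9))   # 360  -> DOS 2.0, double-sided
print(formatted_capacity_kb(2, 80, 15))  # 1200 -> PC/AT high-density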
By the end of the 1980s, the 5¼-inch disks had been superseded by the 3½-inch disks. Though 5¼-inch drives and disks were still available, they faded in popularity as the 1990s began. The remaining users were primarily people who still owned '80s legacy machines (PCs running MS-DOS or home computers) with no 3½-inch drive; the advent of Windows 95 (not sold in stores in a 5¼-inch version; a coupon had to be obtained and mailed in) and the subsequent phaseout of standalone MS-DOS with version 6.22 forced many of them to upgrade their hardware. On most new computers, 5¼-inch drives were optional equipment.
Throughout the early 1980s the limitations of the 5¼-inch format were becoming clear. Originally designed as a smaller and more practical replacement for the 8-inch disk, the 5¼-inch system was itself too large, and as the quality of recording media improved, the same amount of data could be stored on a smaller surface. Another problem was that the 5¼-inch disk was simply a copy of the 8-inch physical format, which had never really been engineered for ease of use. The thin folded-plastic shell allowed the disk to be easily damaged through bending, and allowed dirt to reach the disk surface.
A number of solutions were developed by various companies. They all shared several advantages over the older format, including a small form factor and a rigid case with a slidable write-protect catch. The almost-universal use of the 5¼-inch format made it very difficult for any of these new formats to gain significant market share until 1984, when Sony convinced Apple to use 3½-inch drives in the Macintosh 128K model, effectively making the 3½-inch drive a de facto standard.
The 3½-inch disks had, by way of their rigid case's slide-in-place metal cover, the significant advantage of being much better protected against unintended physical contact with the disk surface than 5¼-inch disks when the disk was handled outside the disk drive. The irregular, rectangular shape had the additional merit that it made it impossible to insert the disk sideways by mistake as had indeed been possible with earlier formats.
Like the 5¼-inch, the 3½-inch disk underwent an evolution of its own. When Apple introduced the Macintosh in 1984, it used single-sided 3½-inch disk drives with an advertised capacity of 400 KB. A newer and better "high-density" (HD) format was introduced in 1987.
Another advance in the oxide coatings allowed a new "extended-density" (ED) format of 2,880 KB, introduced on the second-generation NeXT computers in 1991 and on the IBM PS/2 Model 57 the same year; but by the time it was available it was already too small to be a useful advance over the HD format, and it never became widely used. The 3½-inch drives sold more than a decade later still use the same 1.44 MB HD format that was standardized in 1989.
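The same back-of-the-envelope arithmetic covers the 3½-inch formats, again assuming the usual 80-tracks-per-side, 512-byte-sector geometry rather than anything stated in this article. It also shows why the familiar "1.44 MB" label is really 1,440 KB -- a mix of decimal and binary units:

for sectors_per_track, label in [(9, "double-density"), (18, "high-density"), (36, "extended-density")]:
    total_bytes = 2 * 80 * sectors_per_track * 512   # sides * tracks per side * sectors per track * bytes per sector
    print(label, total_bytes // 1024, "KB =", total_bytes, "bytes")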
Through the early 1990s a number of attempts were made by various companies to introduce newer floppy-like formats based on the now-universal 3½-inch physical format. Most of these systems provided the ability to read and write standard DD and HD disks, while at the same time introducing a much higher-capacity format as well. There were a number of times where it was felt that the existing floppy was just about to be replaced by one of these newer devices, but a variety of problems ensured this never took place. None of these ever reached the point where it could be assumed that every current PC would have one, and they have now largely been replaced by CD and DVD burners and USB flash drives.
Apple dropped floppy drives about five years ago. Dell has made the floppy drive an option, not a standard. Other PC makers have dropped the flop, too. (info from Wikipedia and PC World, pictures from oldcomputers.net)
Thursday, May 10, 2007
1927: garbage disposer invented
2006: disposer maker sues NBC
The garbage disposer (or disposal) was invented in 1927 by John W. Hammes, an architect in Racine, Wisconsin. After eleven years of development, his In-Sink-Erator company put the invention on the market.
Many US cities and towns had regulations against putting food waste (garbage) into the sewer system. In-Sink-Erator spent considerable effort convincing many localities to rescind the prohibitions, and eventually many localities even mandated the use of disposals. New York City banned their use until 1997.
The device first became widely popular in upscale American kitchens of the 1970s and 1980s, yet remains very rare in European countries, due in part to greater promotion of composting kitchen waste and regulations on sewage disposal.
A garbage disposal is featured in 1981's The Incredible Shrinking Woman, when Lily Tomlin's character Pat Kramer falls down the drain and is almost chopped to bits by her family.
The character of Claire Bennet on the NBC TV series Heroes severely (and intentionally) injures her hand in an active garbage disposal unit in the series pilot aired in 2006. This led to a lawsuit by InSinkErator, the manufacturer of the unit.
Claire is a cheerleader with the power of indestructibility, who sticks her hand into a garbage disposal and gets it mangled. A few seconds later her hand heals.
In the lawsuit, the appliance maker claims that NBC used the InSinkErator trademark without the company's consent, that the show "implies an incorrect and dangerous design for a food waste disposer" and that NBC's depiction of the InSinkErator "casts the disposer in an unsavory light, irreparably tarnishing the product."
Dan Callahan, spokesman for InSinkErator parent Emerson Electric, said the company, of course, does not recommend anybody put their hands in a garbage disposal that is turned on. He also pointed out that, according to data from the Consumer Product Safety Commission, you are actually ten times more likely to be injured by your dishwasher than by your garbage disposal.
An NBC spokesperson stated, "While we do not believe there is any legal issue with the episode as originally broadcast, we nonetheless have decided to edit the episode for future uses." NBC is owned by GE, which also makes garbage disposers. Apparently they did not want to show their own brand chomping on human body parts. (info from Wikipedia and CNNMoney.com)
Wednesday, May 9, 2007
2001: last Plymouth
The final Plymouth rolled off a Chrysler assembly line in 2001. It was a silver Neon LX with a manual transmission. It was fitting that the last Plymouth was an economy car, because Walter P. Chrysler created the brand to attract entry-level customers.
The Plymouth was introduced on July 7, 1928. It was Chrysler Corporation's first entry in the low-priced field, then dominated by Chevrolet and Ford. Plymouths were actually priced a little higher than the competition, but offered standard features such as hydraulic brakes that the competition did not provide. Plymouths were originally sold exclusively through Chrysler dealerships.
When Walter Chrysler took over control of the trouble-ridden Maxwell-Chalmers car company in the early 1920s, he inherited the Maxwell as part of the package. After he used the company's facilities to help create and launch the Chrysler car in 1924, he decided to create a lower-priced companion car. So for 1926 the Maxwell was reworked and rebadged as a low-end Chrysler model. At the end of the decade this model was once again reworked and rebadged to create the Plymouth.
While the original purpose of the Plymouth was simply to cover a lower-end marketing niche, during the Great Depression of the 1930s the car would help significantly in ensuring the survival of the Chrysler Corporation when many other car companies failed. Beginning in 1930, Plymouths were sold by all three Chrysler divisions (Chrysler, DeSoto, and Dodge). Plymouth sales were a bright spot during this dismal automotive period, and by 1931 Plymouth rose to the number three spot among all cars.
For much of its life, Plymouth was one of the top selling American automobile brands, along with Chevrolet and Ford ("the low-priced three"). Plymouth even surpassed Ford for a time in the 1940s as the second most popular make of automobiles in the US. Through 1956, Plymouths were known for their durability, affordability and engineering. In 1957, Chrysler's Forward Look styling theme produced cars with much more advanced styling than Chevrolet or Ford, although Plymouth's reputation would ultimately suffer as the cars were prone to rust and sloppy assembly. The marque also introduced its limited production Fury line in 1956, and it too benefited from the crisp Forward Look designs.
The Plymouth brand lost market share rapidly in the early 1960s. While Plymouth was a styling leader from 1957 to 1958, its 1959 through 1962 models were awkwardly styled cars. Plymouth also competed with corporate sister division Dodge when the lower-priced, full-size Dodge Dart was introduced for 1960. Rambler, and then Pontiac would assume the number three sales position for the remainder of the decade. Plymouth went into a decline and never fully recovered.
The marque regained market share with the 1965 models, which returned Plymouth to full-size vehicles and more mainstream styling. Plymouth regained its traditional third place in the sales race in 1971 and 1974, primarily with its popular Valiant and Duster compact models; but Plymouth was hurt by Chrysler's financial woes of the late 1970s. Marketing decisions reduced the Plymouth lineup to the point that it was no longer a full-line brand. New models were increasingly given to the Dodge and Chrysler brands, but denied to Plymouth. By 1979, its lineup consisted of only the domestically produced Volare and Horizon models, and a number of rebadged Mitsubishi imports.
In the late 1990s, four vehicles were sold under the Plymouth name: the Voyager/Grand Voyager minivan, the Breeze mid-size sedan, the Neon compact car, and the Prowler sports car.
After discontinuing the Eagle brand in 1998, Chrysler was planning to expand the Plymouth line with a number of unique models before the corporation's merger with Daimler-Benz. The first model was the Plymouth Prowler, a modern hot rod. The PT Cruiser was to have been the second. Other than the Prowler, at the time of the takeover Plymouth had no unique products.
Furthermore, while all Plymouth dealers also sold Chryslers, many Dodge dealers sold only Dodge; thus it would have caused much more dealer disarray to discontinue Dodge than to discontinue Plymouth. Consequently, DaimlerChrysler decided to drop the make after a limited run of 2001 models.
The last new model sold under the Plymouth marque was the second generation Neon. The PT Cruiser was ultimately launched as a Chrysler, and the Prowler and Voyager were absorbed into that brand as well. (info from St. Petersburg Times and Wikipedia)
Tuesday, May 8, 2007
1830: Macintosh makes first practical raincoat
In its native Central and South America, rubber has been collected since ancient times. Early civilizations played with rubber balls, and coated clothing and footwear to make them waterproof. Rubber was used for other purposes as well, such as strips to hold stone and metal tools to wooden handles, and padding for tool handles.
Spanish Conquistadores were so astounded by the vigorous bouncing of the rubber balls of the Aztecs that they wondered if the balls were enchanted by evil spirits. The Maya made a type of temporary rubber shoe by dipping their feet into latex.
A story says that the first European to return to Portugal from Brazil with samples of water-repellent rubberized cloth so shocked people that he was charged with witchcraft.
When samples of rubber first arrived in England, it was observed by Joseph Priestley, in 1770, that the material was extremely good for rubbing out pencil marks on paper, hence the name.
Unfortunately, the rubber used for waterproofing by the Mayas and Incas had some serious flaws: it got hard and brittle in cold weather, and soft and sticky in hot weather.
In 1823, Scottish chemist Charles Macintosh patented a method for making waterproof garments by using rubber dissolved in coal-tar naphtha for cementing two pieces of cloth together.
He took wool cloth and painted one side with the dissolved rubber preparation, then placed another layer of wool cloth on top. This created the first useful waterproof fabric, but it was not perfect: it was easy to puncture, became stiff in cold weather, and turned sticky in hot weather.
American inventor Charles Goodyear later discovered the process of vulcanization, which stabilized rubber, and Macintosh used the improved material to make better fabrics. The modern raincoat was born in 1830.
Macintosh's original intention was to make tarpaulins, but tailors started using the fabric for raincoats. The trouble was when they sewed the fabric it let rain in through the needle holes.
To save his reputation from disgruntled raincoat consumers, Macintosh started making coats with waterproofed seams. He added a tartan lining and had a rainproof coat. It was hot, however, and metal eyelets were installed under the armpits for ventilation in 1851. The original raincoats were yellow with capes, the kind still worn by police officers during rainstorms.
(info from About.com, Wikipedia, askandyaboutclothes.com)
Monday, May 7, 2007
1914: Mothers Day becomes an official US holiday
Next Sunday, May 13, Mothers Day will be celebrated in the US and Canada; other countries observe the holiday on other dates. According to the National Restaurant Association, Mothers Day is the most popular day of the year to dine out. In some parts of the United States, Mothers Day marks the beginning of the tomato planting season.
Contrary to common opinion, the holiday was not invented by the Hallmark Card company. Mothers Day goes back to an ancient Greek spring festival dedicated to Rhea, mother of many deities of Greek mythology.
Ancient Romans celebrated a spring festival called Hilaria, dedicated to Cybele, a mother goddess, starting around 250 BCE. Hilaria lasted for three days in mid-March and included parades, games and masquerades. The celebrations were so raucous that followers of Cybele were banished from Rome.
Early Christians celebrated a Mother's Day of sorts on the fourth Sunday of Lent in honor of the Virgin Mary. In England the holiday was expanded to include all mothers, and called Mothering Sunday.
The American Mothers Day was copied from England by Julia Ward Howe, an activist, writer and poet famous for her Civil War song, Battle Hymn of the Republic.
Howe suggested that June 2 be celebrated annually as Mothers Day, dedicated to peace. In 1870, she wrote her famous Mothers Day Proclamation, a passionate appeal to women to oppose war. She also initiated a Mothers' Peace Day observance on the second Sunday in June in Boston, and held the meeting for a number of years. Howe tirelessly championed an official Mothers Day holiday, but failed to win formal recognition of a Mothers' Day for Peace.
Anna Jarvis never married or had children, but is known as the Mother of Mothers Day.
Anna Jarvis got the inspiration for celebrating Mothers Day from her mother, Anna Marie Reeves Jarvis. An activist and social worker, Mom Jarvis hoped that someday someone would honor all mothers, living and dead, and pay tribute to their contributions.
Anna never forgot her mother's words, and when her mother died in 1905, she resolved to fulfill her mother's desire for a Mothers Day. Initially, Anna supplied carnations for a church service in West Virginia to honor her mother, who favored that flower. Later, Anna and her supporters lobbied for the official declaration of a Mothers Day holiday. By 1911, Mothers Day was celebrated in almost every state, and in 1914 President Woodrow Wilson signed a Joint Resolution designating the second Sunday in May as Mothers Day, a day for American citizens to show the flag in honor of mothers whose sons had died in war.
Just nine years after the first official Mothers Day holiday, commercialization became so rampant that Anna Jarvis herself became a major opponent of what the holiday had become. Mothers Day is now one of the most commercially successful US holidays. (info from mothersdaycelebration.com and Wikipedia)
Contrary to common opinion, the holiday was not invented by the Hallmark Card company. Mothers Day goes back to an ancient Greek spring festival dedicated to Rhea, mother of many deities of Greek mythology.
Ancient Romans celebrated a spring festival called Hilaria, dedicated to Cybele, a mother goddess, starting around 250 BCE. Hilaria lasted for three days in mid-March and included parades, games and masquerades. The celebrations were so raucous that followers of Cybele were banished from Rome.
Early Christians celebrated a Mother's Day of sorts on the fourth Sunday of Lent in honor of the Virgin Mary. In England the holiday was expanded to include all mothers, and called Mothering Sunday.
The American Mothers Day was copied from England by Julia Ward Howe, an activist, writer and poet famous for her Civil War song, Battle Hymn of the Republic.
Howe suggested that June 2 be annually celebrated as Mothers Day, and be dedicated to peace. In 1870, she wrote her famous Mothers Day Proclamation, a passionate appeal to women to oppose war. She also initiated a Mothers' Peace Day observance on the second Sunday in June in Boston and held the meeting for a number of years. Howe tirelessly championed the cause of official celebration of Mothers Day and declaration of official holiday on the day, but failed in her attempt to get formal recognition of a Mothers' Day for Peace.
Anna Jarvis never married or had children, but is known as the Mother of Mothers Day.
Anna Jarvis got the inspiration of celebrating Mothers Day from her mother Anna Marie Reeves Jarvis. An activist and social worker, Mom Jarvis hoped that someday someone would honor all mothers, living and dead, and pay tribute to their contributions.
Anna never forgot her mother's words, and when her mother died in 1905, she resolved to fulfill her mother's wish for a day honoring mothers. Initially, Anna supplied carnations, her mother's favorite flower, for a church service in West Virginia held in her mother's honor. Later, Anna and her supporters lobbied for the official declaration of a Mothers Day holiday. By 1911, Mothers Day was celebrated in almost every state, and in 1914 President Woodrow Wilson signed a Joint Resolution designating the second Sunday in May as Mothers Day, a day for American citizens to show the flag in honor of mothers whose sons had died in war.
Within nine years of the first official Mothers Day holiday, commercialization had become so rampant that Anna Jarvis herself turned into a major opponent of what the holiday had become. Mothers Day is now one of the most commercially successful US holidays. (info from mothersdaycelebration.com and Wikipedia)
Friday, May 4, 2007
1901: first instant coffee
Just-add-hot-water "instant" coffee was invented by Japanese American chemist Satori Kato of Chicago in 1901.
The first mass-produced instant coffee was the invention of George Constant Washington, an English chemist living in Guatemala.
While waiting for his wife to join him for coffee, he noticed a fine powder on the spout of the silver coffee pot, apparently condensed coffee vapor. This intrigued him and led to his development of soluble coffee. He began experiments in 1906 and put Red E Coffee on the market in 1909.
In 1938, Nestlé introduced Nescafé instant coffee after being asked by Brazil to help find a solution to its coffee surpluses. Nescafé was first sold in Switzerland, where the company is based.
Instant coffee really took off in the mid-1950s, when television watching became widespread. Commercial breaks were too short to brew a pot of tea but long enough to make a cup of instant coffee. Coffee giants like Nestlé and General Foods realized this was a big opportunity and advertised their instant coffee during the breaks. Tea companies introduced the tea bag to compete. (info from About.com)
Thursday, May 3, 2007
200: first wheelbarrow
Apparently the wheelbarrow was not invented by some anonymous caveman; it was invented in China around the year 200 by army general Chuko Liang.
Liang's wheelbarrows, used to transport supplies and injured soldiers, had two wheels and required two men to propel and steer. (info from "Imaginative Inventions" and About.com)
Wednesday, May 2, 2007
1931: last Rolls-Royce made in the United States
In 1884, a self-taught engineer named Henry Royce established a small business in Britain with only 70 pounds (about $100). Initially Royce produced only electrical motors and generators, but eventually built a car in 1904. That same year, Royce met London car dealer Charles Rolls.
At their first meeting, Royce took Rolls for a spin in the car. Legend has it that as he climbed aboard, Rolls asked Royce to “start her up.” Royce replied, “My dear fellow, she’s already running!” Soon after, they reached an agreement that Royce Limited would manufacture cars to be sold exclusively by CS Rolls & Company. The success of these Rolls-Royce cars led to the formation of the Rolls-Royce Company in 1906.
The charter included a provision that the company should produce engines for use “on land or water or in the air.” That same year, Rolls-Royce launched the Silver Ghost, hailed as “the best car in the world,” and opened its first US sales office in New York City.
Meanwhile, Charles Rolls had been introduced to the Wright brothers and become passionate about flight. Until his untimely death in a 1910 plane crash, Rolls worked hard to persuade his partner to venture into the aviation business. Royce preferred cars.
In 1919, Rolls-Royce recognized that the US was the most important car market in the world; Americans bought more cars each year than the rest of the world combined. A factory was built in Springfield, Massachusetts, and the first chassis was completed in 1921. By 1923, Rolls-Royce's presence in the US was substantial, with offices around the country.
Rolls-Royce of America manufactured nearly 3,000 Silver Ghosts and Phantoms before succumbing to the Depression in 1931. To this day, Springfield is the only place outside England where Rolls-Royce cars have ever been built. (info from Rolls-Royce; photo from classic-british-cars.com)
Tuesday, May 1, 2007
1992: last new Yugo sold in the US
Introduced in the summer of 1986 at a price of less than $4,000, the Yugoslavian-made Yugo was by far the lowest-priced new car available in the US at the time, and it sold very well at first. But by the early 1990s, the effects of United Nations sanctions on Yugoslavia forced manufacturer Zastava to withdraw the car from the US market.
American-sold Yugos were based on the Fiat 127, built under license from Fiat. The Yugo was imported by Malcolm Bricklin, who had previously imported the Subaru and wanted to introduce a simple, low-cost car to the US. The name is a pun on "You Go."
In the US, the Yugo developed a reputation for poor quality and low power, and was frequently the butt of jokes. Defenders of the brand argued that Detroit carmakers were collaborating with influential automotive media to eliminate low-priced competition.
At first, four models were sold in the United States: the basic entry-level $3,990 GV (for "Great Value"), the nearly-identical GVL and GVS with minor trim and upholstery upgrades, and the race-inspired GVX with the 1300 cc engine, five-speed manual transmission and standard equipment including a plush interior, ground-effects package, alloy wheels and rally lights. The Cabrio convertible was introduced in 1988.
By 1990, the GV and GVL, with their 1100 cc engine and four-speed manual transmission, had been replaced by a new standard model, the GV Plus, powered by a 1300 cc OHC engine with a five-speed manual transmission; an optional Renault-designed automatic transmission was also offered.
Wide familiarity with the Fiat 127's prowess as an autocross racer meant that many Yugo GVs were modified with Abarth racing parts and sent to participate in SCCA-sanctioned events.
The Yugo was marketed as a car for everyone, providing basic, economical and reliable transportation like the Volkswagen Beetle and the earlier Ford Model T. The car was promoted as a uniquely affordable new vehicle, an option for buyers who would otherwise have chosen a used car, and as a reliable second car for wealthier buyers. (info from Wikipedia)