Wednesday, February 28, 2007

2007:
Chief Illiniwek's last dance

After 20 years of pressure from activists who found the symbol offensive, the University of Illinois at Urbana-Champaign has decided to do away with Chief Illiniwek and his dance; the last performance came as Illinois played Michigan in the final home game of the season.

The school's decision led the NCAA to lift sanctions that had barred Illinois from hosting postseason sports since 2005. The NCAA had deemed Illiniwek -- portrayed since 1926 by students who danced at home football and basketball games in buckskin regalia -- an offensive use of American Indian imagery.
But, in the eyes of orange-clad students who waited outside in chilly weather for hours ahead of the game, the decision robbed the school of a piece of its history. Jonathan Bluenke said Chief Illiniwek will be missed most on the football field, where Illini fans haven't had much to cheer about the past few years. ''If we were down by like 30, people stuck around for the chief,'' said Bluenke, who sat with another student in the cold shadow of Assembly Hall. ''Honestly, that's like what you hear in the stands.''

Under the new plan, the university still will be able to use the name Illini, because it's short for Illinois, and the nickname Fighting Illini, because it's considered a reference to the team's competitive spirit. Some people want the university to end the use of those names.

As Chief Illiniwek took the floor for the last time, a video montage of chiefs past played above the court. After the halftime dance, hundreds in the arena shed their orange Illini shirts to reveal black shirts worn underneath, mourning the loss of the chief.

''To me the chief is spirit,'' said Paul Bruns, a retiree who worked for the university for 38 years. ''Why did (American Indians) dance? They danced for spirit.'' (from the Chicago Sun-Times)

Tuesday, February 27, 2007

2007:
"Planet of the Apes" preview?
Chimps observed making weapons

Chimpanzees living in the West African savannah have been observed fashioning deadly spears from sticks and using the tools to hunt small mammals -- the first routine production of deadly weapons ever observed in animals other than humans.

The multistep spearmaking practice, documented by researchers in Senegal who spent years gaining the chimps' trust, adds credence to the idea that human forebears made weapons millions of years ago.

Researchers from a National Geographic-sponsored study found that the chimps use tools as weapons to hunt smaller mammals, a behavior scientists had previously believed was unique to humans.

The landmark observation also supports the long-debated proposition that females -- the main makers and users of spears among the Senegalese chimps -- tend to be the innovators and creative problem solvers in primate culture.

Using their hands and teeth, the chimpanzees were repeatedly seen tearing the side branches off long, straight sticks, peeling back the bark and sharpening one end. Then, grasping the weapons in a "power grip," they jabbed them into tree-branch hollows where bush babies -- small, monkeylike mammals -- sleep during the day.

In one case, after repeated stabs, a chimpanzee removed the injured or dead animal and ate it, the researchers reported in yesterday's online issue of the journal Current Biology.

"It was really alarming how forceful it was," said lead researcher Jill D. Pruetz of Iowa State University, adding that it reminded her of the murderous shower scene in the Alfred Hitchcock movie "Psycho." "It was kind of scary."

The new observations are "stunning," said Craig Stanford, a primatologist and professor of anthropology at the University of Southern California. "Really fashioning a weapon to get food -- I'd say that's a first for any nonhuman animal."

Scientists have documented tool use among chimpanzees for decades, but the tools have been simple and used to extract food rather than to kill it. Some chimpanzees slide thin sticks or leaf blades into termite mounds, for example, to fish for the crawling morsels. Others crumple leaves and use them as sponges to sop drinking water from tree hollows.

But while a few chimpanzees have been observed throwing rocks -- perhaps with the goal of knocking prey unconscious, but perhaps simply as an expression of excitement -- and a few others have been known to swing simple clubs, only people have been known to craft tools expressly to hunt prey.

Pruetz and Paco Bertolani of the University of Cambridge made the observations near Kedougou in southeastern Senegal. Unlike other chimpanzee sites currently under study, which are forested, this site is mostly open savannah. That environment is very much like the one in which early humans evolved and is different enough from other sites to expect differences in chimpanzee behaviors.

Pruetz recalled the first time she saw a member of the 35-member troop trimming leaves and side branches off a branch it had broken off a tree.

"I just knew right away that she was making a tool," Pruetz said, adding that she suspected -- with some horror -- what it was for. But in that instance she was unable to follow the chimpanzee to see what she did with it. Eventually the researchers documented 22 instances of spearmaking and use, two-thirds of them involving females. (from the Washington Post. Photo by Paco Bertolani)

Monday, February 26, 2007

1891:
beginning of the end for telephone operators

For most of early telephone history, calls were completed by young men and women working at the phone company central office, where they used plug-in cords to connect callers to the people and businesses they wanted to communicate with.

Almon Strowger was an undertaker/inventor in Kansas City, MO, who was described as "eccentric, irascible and even mad." He was motivated to invent an automatic telephone system after having trouble with local Bell Telephone operators.

He thought the operators were sending calls to a competitor rather than to his business. The origin of this suspicion reportedly arose from an incident when a friend died and the family contacted a rival undertaker. Other stories claim that the wife or cousin of a competing undertaker was a telephone operator and Strowger suspected that the operators were telling callers that his line was busy or connecting his callers to the competition. Yet another story has him boasting of inventing "the girl-less, cuss-less telephone."

Convinced that callers -- not operators -- should choose who was called, Strowger first conceived his invention in 1888, and patented the automatic telephone exchange in 1891.

The patented system consisted of two parts:

1. A device for use by the customer, creating trains of on-off electric pulses corresponding to the digits 0-9. This equipment originally consisted of two telegraph keys, and evolved into the rotary dial telephone.
2. A stepping switch at the telephone exchange. A rotating arm stepped, in a semi-circular arc, across 10 possible contact points. The stepping motion was driven by the electrical pulses coming from the originating customer's telegraph keys, and later from the rotary dial.

Cascading enabled connections among more than 100 customers; additional switches could be positioned vertically and horizontally to increase the switching capacity, as in the sketch below.
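
To make the cascading idea concrete, here is a minimal sketch in modern Python (purely illustrative; the function names and the two-digit, 10x10 sizing are my assumptions, not details from the patent) of how trains of dial pulses could step two cascaded 10-position selectors to pick one of 100 lines:

```python
def digit_from_pulses(pulses):
    """Convert a count of dial pulses (1-10) to the digit it represents; ten pulses means 0."""
    if not 1 <= pulses <= 10:
        raise ValueError("a dialed digit produces between 1 and 10 pulses")
    return 0 if pulses == 10 else pulses


def select_line(first_digit_pulses, second_digit_pulses):
    """Step two cascaded 10-position selectors to reach one of 100 lines.

    The first selector's arm advances one contact per pulse of the first digit;
    the second selector does the same for the second digit. Chaining (cascading)
    the stages is what lets an exchange serve far more than 10 subscribers.
    """
    tens = digit_from_pulses(first_digit_pulses)
    units = digit_from_pulses(second_digit_pulses)
    return tens * 10 + units


# Dialing "3" then "7" -- three pulses, then seven -- selects line 37.
print(select_line(3, 7))   # 37
# Dialing "0" then "5" (ten pulses, then five) selects line 5.
print(select_line(10, 5))  # 5
```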

The almost unlimited potential for expansion gave the Strowger system a big advantage. Previous systems connected a fixed number of subscribers directly to each other in a mesh arrangement, which became dramatically more complex as customers were added, since each new customer needed a direct connection to every other customer. In modern terminology, the previous systems were not scalable.
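
A quick back-of-the-envelope sketch of why the mesh approach failed to scale (my own arithmetic, not figures from the period): wiring every subscriber directly to every other requires n(n-1)/2 connections, so the wiring count grows roughly with the square of the number of subscribers.

```python
def mesh_links(subscribers):
    """Point-to-point connections needed to wire every subscriber to every other."""
    return subscribers * (subscribers - 1) // 2

for n in (10, 100, 1000):
    print(n, mesh_links(n))
# 10 subscribers    ->      45 connections
# 100 subscribers   ->   4,950 connections
# 1,000 subscribers -> 499,500 connections
```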

The Strowger Automatic Telephone Exchange Company was formed, and it opened its first exchange in La Porte, Indiana in 1892, with about 75 subscribers. It took the Bell Telephone System a long time to become convinced that automatic switching made more sense than human operators for completing local calls, and Bell finally licensed Strowger's technology in 1924.

Strowger sold his patents for $1,800 in 1896, and they were resold for $2.5 million in 1916. In 1898, he got $10,000 for his share in the Automatic Electric Company, which later became GTE, which became part of Verizon. (Photo shows operators at the Roseburg Telephone and Telegraph Co. circa 1910, from Douglas County OR Museum) (Info from Wikipedia & other sources)

Wednesday, February 14, 2007

On Vacation


We'll be away for a bit of relaxation and recharging. New posts will resume on 2/26. If you miss me, you can read all of the posts on all of my blogs.

Tuesday, February 13, 2007

1935: Hollywood's last silent film

The first commercial screening of movies with fully synchronized sound took place in 1923, and the first feature-length movie originally presented as a talkie was The Jazz Singer, released in 1927; but silent films continued to be made into the next decade.

The last silent film ever produced in Hollywood was released by Paramount International in 1935. Legong: Dance of the Virgins was originally shown only outside the US because of concerns about the film's female nudity and the uproar it would cause. It was filmed in Bali, Indonesia.

The movie is a tragic tale of love denied. Poutou, a young girl who is a respected Legong dancer, falls in love with young musician Nyoung. Her father is delighted with Poutou's choice and wants to help her to conquer Nyoung's heart. But Poutou's half sister Saplak also wants Nyoung, and when he chooses Saplak, Poutou drowns herself. The movie displays Balinese culture including frenetic dances, mystical parades, the local marketplace, a cockfight and a mass cremation.

Finally, in the late 1930s, it was shown in theaters in Hollywood and New York City, attracting thousands who came to see bare-breasted native girls. (info from BaliFilm.com, Milestone Films and Wikipedia)

Monday, February 12, 2007

2007:
first female president at Harvard

Harvard University on Sunday named historian Drew Gilpin Faust as its first female president, ending a lengthy and secretive search. The seven-member Harvard Corporation elected Faust, a noted scholar of the American South and dean of Harvard's Radcliffe Institute for Advanced Study, as the university's 28th president.

Faust, 59, recognized the significance of her appointment.

"I hope that my own appointment can be one symbol of an opening of opportunities that would have been inconceivable even a generation ago," Faust said at a news conference on campus. But she also added, "I'm not the woman president of Harvard, I'm the president of Harvard."

With Faust's appointment, half of the eight Ivy League schools will have a woman as president. The others are: Amy Gutmann of the University of Pennsylvania, Shirley M. Tilghman of Princeton University, and Ruth J. Simmons of Brown University.

The Ivy League schools had only male students until the early 1970s. The Harvard class of 2008 was the first in which women outnumbered men among those admitted under the Early Action program.

"This is a great day, and a historic day, for Harvard," said James R. Houghton, chairman of the presidential search committee.

Faust is the first Harvard president since Charles Chauncy -- an alumnus of Cambridge University in England who died in office in 1672 -- not to hold an undergraduate or graduate degree from the university. She attended Bryn Mawr College and the University of Pennsylvania, where she was also a professor of history.

The Harvard presidency is perhaps the most prestigious job in higher education, offering a pulpit where remarks resonate throughout academic circles; and unparalleled resources, including a university endowment valued at nearly $30 billion.

Born to a privileged family in Virginia's Shenandoah Valley, Faust wrote that a conversation at age nine with the family's black handyman and driver inspired her to send a letter to President Eisenhower pleading for desegregation. She then began to question the rigid Southern conventions where girls wore "scratchy organdy dresses" and white children addressed black adults by their first names.

"I was the rebel who did not just march for civil rights and against the Vietnam War but who fought endlessly with my mother, refusing to accept her insistence that 'this is a man's world, sweetie, and the sooner you learn that, the better off you'll be,'" she writes. (info from WJLA.com and Harvard University Gazette)

Friday, February 9, 2007

1952:
invention of first TV-advertised toy

Mr. Potato Head started as a breakfast cereal premium, developed by New York inventor George Lerner. Just before 1950, he designed and produced a first generation set of plastic face pieces with pins that could be pushed into fruits or vegetables to transform the food into an endless array of playmates.

The toy wasn't successful at first. There was still a World War II mentality of conserving resources, and toy companies didn't think customers would accept the idea of wasting a piece of food on a child's toy. But after a while, George finally sold the toy to a company that planned to use the pieces as a giveaway in cereal boxes.

George thought that his new toy deserved a bigger opportunity, and it came from Hasbro, a New England toy manufacturer.

Henry and Merrill Hassenfeld were the second generation of brothers to run their family’s business. Although their roots were in textiles, they also enjoyed success making pencil boxes with surplus book binding fabric. They soon found that the boxes sold better when they were filled with pencils and other school supplies. Merrill experimented with filling the boxes with small toys instead of school supplies, and the idea took off. They began making their boxes into doctor kits, nurse kits, paint sets and even junior air raid kits.

In 1951, George Lerner approached Merrill with a set of toy face pieces as an idea to fill one of their boxes. Merrill loved the idea of making funny faces with fruits and vegetables, and bought the toy from the cereal company. It became their first huge toy hit, and made Hasbro a powerful toy brand.

Mr. Potato Head was the first toy to be advertised on television.

The original Mr. Potato Head contained only parts, such as eyes, ears, noses and mouths, and parents had to supply children with real potatoes for face-changing fun! Eight years later, a hard plastic potato "body" was included with Mr. Potato Head to replace the need for a real potato. Over the next three decades, a variety of Mr. Potato Head products were sold. He was so loved by children, that he was expanded into additional toy categories including puzzles, creative play sets, and electronic hand-held, board and video games. The vast popularity of Mr. Potato Head also attracted non-toy companies who licensed his image and name to make apparel, accessories and novelty items.

In 1985, he received four write-in votes in the mayoral election in Boise, Idaho. Mr. Potato Head's appeal to people young and old made him the ideal ambassador for many causes and good-will efforts. In 1987, Mr. Potato Head surrendered his signature pipe to the U.S. Surgeon General, C. Everett Koop, and became the "spokespud" for the American Cancer Society's annual "Great American Smokeout" campaign -- a role he carried out for several years. On his 40th birthday, it was decided that he would no longer be a "couch potato" and he received a special award from the President's Council for Physical Fitness, right on the lawn of the White House! Always one to pass on a wholesome message to the public, he and Mrs. Potato Head joined up with the League of Women Voters in 1996 to help out with their "Get Out the Vote" campaign and spread the word to Americans about the importance of voting.

Mr. Potato Head's celebrity status carried him to Hollywood in 1995, where he played a supporting role in Disney's "Toy Story." And, in 1997, Burger King hired him to be the spokespud for the introduction of their new french fry and "Try the Fry" campaign. In the fall of 1998, he starred in his own Saturday morning children's television program, "The Mr. Potato Head Show," on the Fox Children's Network. (info from Hasbro, mrpotatohead.net, and About.com)

Thursday, February 8, 2007

1861:
first Unisex hair salon in Nevada

Until the mid-1960s, most American men had their hair cut in barber shops, and most American women had their hair "done" or "styled" in beauty parlors.

At the height of the hippie era, when men let their hair grow long, some beauty salons started treating the tresses of both genders, and proclaimed themselves to be "Unisex" salons, to welcome men.

While it was their intention to imply that they provided the same services for both males and females, the terminology was inaccurate. A unisex salon would be one that provided service for members of just one sex. A business that served both men and women should be considered bisexual... but a large neon sign that promoted bisexuality might not draw many customers.

Actually, there was at least one "we clip anyone" salon long before the hippie era. In 1861, Arthur's Tonsorial Parlour in Virginia City, Nevada provided service for short-haired women during the population boom caused by the Comstock Lode silver discovery. Virginia City was such a boomtown that for a long time it housed the only elevator to be found from California to Chicago.

Virginia City is one of the oldest cities in Nevada, and also one of the oldest west of the Mississippi River. It is one of the most famous boomtowns in the Old West as it virtually appeared overnight as a result of the silver discovery in 1859. At its peak, Virginia City had a population of nearly 30,000. When the Comstock Lode ended in 1898, the city's population declined sharply. Today, Virginia City is but a shadow of its former glory, with about 1,500 people.

Virginia City could be considered the birthplace of Mark Twain, because in 1863 Samuel Clemens, then a reporter on the local newspaper, first used his famous pen name there. Virginia City is also known for being the nearest town to the Cartwright Ranch on the Bonanza television series. It is also the name and the setting of a 1940 Errol Flynn movie set during the Civil War, and the place where Marty McFly, the lead character in the Back to the Future trilogy, was killed. (info from Wikipedia and other sources)

Wednesday, February 7, 2007

1998:
last Macintosh clone

Despite its overwhelming success with the iPod and iTunes, Apple has never achieved a big share of the personal computer market. According to some industry observers, this is because the company refused to license its technology to other companies, which could have built Mac-like machines, expanded the installed base of Mac-compatibles, and thereby stimulated software development and growth of the Mac market.

There was a brief period in the late 1990s, when Apple did permit Mac cloning, and it became an $800 million business, with computers made by Motorola, Power Computing and others, based on Motorola and IBM chips.

Apple execs said that Mac compatibles didn't boost the market -- they only took business from Apple.

Apple canceled the program in 1998, leading to low closeout prices on Motorola and Power Computing clones that ironically hurt Apple even more.

Many Mac fans professed loyalty to cloners, which typically offered better performance at lower prices than Apple.

The top Mac clonemaker, Power Computing, spent heavily to build up production, and grabbed 9.5% of the Mac market. Low on cash, Power planned an initial public offering, but couldn’t proceed because of uncertainty over arrangements with Apple. Power halted construction of a $25 million headquarters in Texas, and its president, former Dell Computer marketing wiz Joel Kocher, quit abruptly after directors rejected his plan to fight Apple in court and in the market with a price war. (info from Business Week & Cnet)

Tuesday, February 6, 2007

1982: first portable PC, first IBM PC clone

Compaq is now the low-end brand name used by H-P, but Compaq Computer Corporation has an important place in PC history. Compaq was founded in 1982 by three men from Texas Instruments who invested $1,000 each to form their own company. Sketched on a paper place mat in a Houston pie shop, the first product was a "compact" portable personal computer.

The Compaq Portable was the first 100% compatible IBM computer clone. It could run the software written for IBM's PCs, which was a major achievement at the time. Compaq couldn't simply copy IBM's BIOS (Basic Input/Output System, the low-level software that handles startup and basic input and output) to guarantee compatibility; that would have been illegal, and easy for IBM to prove. Instead, Compaq decided to reverse-engineer IBM's BIOS, using two separate sets of programmers: one group that had access to IBM's source code and another that knew nothing about it.

The first group analyzed the original code, and made notes of exactly what it did. The second group analyzed the notes, and wrote their own BIOS that performed identically. It took one year and a million dollars to accomplish.

More than a mere IBM clone, the Compaq Portable was something different: it was transportable, designed so it could easily be taken aboard an airliner as carry-on luggage. The machine was very successful for Compaq, and the company took in $111 million in its first year, a record in American business.

This precursor of today's lightweight laptops and palmtops weighed 28 pounds, had a 9-inch monochrome display, and cost $3,590 with two 5-1/4" floppy drives and 640K of RAM. A basic version, with just one drive and 128K of RAM, sold for $2,995 -- considered a bargain compared to IBM prices at the time. (info from OldComputers.net and Byte magazine)

Monday, February 5, 2007

1930s:
invention of the fried clam strip

Many people think that clam strips are what's left of "whole belly" soft-shell clams -- the kind commonly used for steaming or frying -- after the yucky parts are cut away. Clam strips are actually slices of the "foot" of large sea clams, which can be up to nine inches in diameter and are found from the shoreline down to about 120 feet.

Although people have been eating clams for centuries, fried clam strips have only been around for about 75 years.

Thomas Soffron, a clam digger and businessman from Ipswich, Mass., invented the clam strip and was the first to market it. Soffron was a finicky eater, and when served steamed clams, he wouldn't eat the neck and he wouldn't eat the belly -- he just ate a sanitized strip.

Soffron later dug some hard-shell clams, sliced the digging foot into eighth-inch strips and fried them. He liked the sweet taste and thought he had discovered a food with broad appeal, one that could travel better than the soft-shell clams dug closer to shore. Tom Soffron and his brothers went into business in 1932 to market clam strips to people who had never eaten clams before. The business took off when they met another young entrepreneur named Howard Johnson in the 1940s.

Johnson was opening roadside restaurants in New England. The brothers tried to sell him their clam strips. He tasted, smiled, and a deal was done. At that time, few people outside of Ipswich even knew clam strips existed.

When the Soffron Brothers and Howard Johnson's joined forces, the timing was perfect for both to realize tremendous growth and success. Right after World War II, the country was building the interstate highway system, Howard Johnson's would eventually open hotels and restaurants from coast to coast (sadly, there are only about five left now), and the "Tendersweet Fried Clams" became a nationwide favorite. The Soffrons once operated seven processing plants, from Maryland to Nova Scotia.

Soffron was born in Kalamata, Greece. When he was an infant, his family immigrated to the United States and eventually settled on a farm in Ipswich. During the Great Depression, Soffron moved to New York City and worked in hotel restaurants. In 1938, he returned to Ipswich and started digging clams.

Soffron was also the lead singer and guitarist with Talambekos Mandolinata, a string band that performed at Greek social events in New England and New York in the 1940s and '50s, and also made commercial recordings. Soffron died on Feb. 21, 2004 at age 96 in Ipswich, his hometown.

(Lots more about clams at WeLoveClams.com)

Friday, February 2, 2007

1973:
egg mcmuffin, the perfect portable breakfast

People have been eating breakfast sandwiches (lox on a bagel, egg on a roll) for a very long time. But the notion of a portable morning meal that could be eaten with one hand while the other clutches the steering wheel really took off in 1973, when McDonald's introduced the Egg McMuffin. It also put McDonald's into the breakfast business.

The Egg McMuffin is a slice of Canadian bacon (which doesn't have to come from Canada any more than French fries must be imported from France), a grilled egg, and a slice of cheese on an English muffin. A Sausage McMuffin with Egg is a popular variation that uses a sausage patty in place of the Canadian bacon. In the UK, the McMuffin is available without bacon for vegetarians, and people order a "Bacon Egg McMuffin" to get the equivalent of the US version.

Former McDonald's President Ray Kroc wrote that Herb Peterson, the operator of their Santa Barbara franchise, asked him to look at something, not telling him what it was because it was "...a crazy idea, a breakfast sandwich. It consisted of an egg that had been formed in a Teflon circle with the yolk broken, and was dressed with a slice of cheese and a slice of grilled Canadian bacon. It was served open-faced on a toasted and buttered English muffin."

Peterson believed that to launch an entirely new food line, such as breakfast, McDonald's needed something unique and yet something that could be eaten like all other McDonald's items -- with the fingers. His solution came when he started modifying an Eggs Benedict sandwich that was being marketed by Jack-in-the-Box.

By Christmas of 1971, Peterson had been working on the product for months. He had experimented with prepackaged Hollandaise, which he rejected as too runny. He replaced it instead with a slice of cheese, which when melted on a hot egg produced the consistency he was looking for. He also had to develop a foolproof way of preparing an egg on a grill to give it the appearance of a poached egg. Poaching eggs did not fit with McDonald's assembly line production process, but Peterson solved the problem by developing a new cooking utensil -- a cluster of six Teflon-coated metal rings -- that was placed on the grill to give eggs a round shape to match an English muffin. When he added grilled Canadian bacon, Peterson had a breakfast product perfect for a sandwich-oriented fast-food chain.

In 1971, Kroc was celebrating Christmas at his ranch near Santa Barbara, and Peterson asked him to stop by the store.

Peterson was ready with a demonstration of his new product, and a flip-chart to explain its economics; but it was taste, not economics that convinced Kroc. He had just finished lunch before seeing Peterson, but he devoured two of the new sandwiches anyway. At Kroc's request, Peterson took his Teflon rings to Chicago to prepare his new breakfast for the rest of McDonald's senior managers, all of whom responded as positively as Kroc had.

McDonald's was ready to test the product nationwide, as soon as it settled on a name. Peterson favored calling it McDonald's Fast Break Breakfast, but the name had been copyrighted, but never used, by Nabisco. One evening, Mr. and Mrs. Kroc were having dinner with McDonald's exec Fred Turner and his wife Patty. Patty Turner suggested calling it the Egg McMuffin, and the name stuck.

The Egg McMuffin has 300 calories, the fewest of any McDonald's breakfast item. (info and some direct quotes from MCDONALD'S: BEHIND THE ARCHES by John F. Love, Wikipedia, & other sources).

Thursday, February 1, 2007

2006:
ONE BILLION cellphones sold

According to market research firm IDC, the worldwide cellphone market reached a new milestone in 2006, with more than one billion phones sold by manufacturers.

The actual total, according to IDC, was 1.019 billion phones, or 22.5% more than the 832.8 million units sold in 2005. For the quarter ending December 31, 2006, phone makers sold 294.9 million units, or 19.7% more than the 246.4 million units they shipped during the last quarter of 2005, setting a sales record for a single quarter.
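
As a quick sanity check of the growth rates quoted above (my own back-of-the-envelope arithmetic, not IDC's):

```python
# Year-over-year growth implied by the shipment figures quoted above (millions of units).
full_year_growth = (1019.0 - 832.8) / 832.8 * 100   # 2006 vs. 2005
q4_growth = (294.9 - 246.4) / 246.4 * 100           # Q4 2006 vs. Q4 2005

print(f"{full_year_growth:.1f}%")  # ~22.4%, in line with the reported 22.5%
print(f"{q4_growth:.1f}%")         # 19.7%, matching the reported figure
```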

An IDC spokesman said, "It was not long ago that shipments into mature markets, including Japan, North America, and Western Europe, consumed the majority of devices shipped worldwide. More recently, however, device shipments into emerging economies in Asia/Pacific, Central and Eastern Europe, the Middle East, Africa, and Latin America have surpassed shipments to mature markets, and the difference between the two continues to grow."