Tuesday, February 27, 2007

You Are What You Hear

iPod users, do those pesky earpad cords get tangled when you dance? Tired of the incessant banging and tickling of the cords against your sensitive skin as you air-guitar? Do your wireless Bluetooth earphones keep getting interference? Well, never fear! Now, through the miracle of low-level electrocution, you can send the music right through your body from your MP3 player to your ears!

Story: http://www.newscientisttech.com/channel/tech/dn10663

Patent: HERE

Throw away those cords! Sony has developed a system in which audio signals are sent through a conductive cloth pad directly into your skin. At a few millionths of an amp, the signal travels through the capacitor – your body – to come out at specially designed earpads. The 500 kHz–3 MHz signal can carry 48 kilobytes per second. Frankenstein's monster only received life from the electricity sent through him. You can have the Rolling Stones! You won't feel a thing except rockin' rhythms as you dance to your body-conducted tunes. Look, Ma, no wires!
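For the skeptics: is 48 kilobytes per second actually enough for music? Here's a quick back-of-the-envelope check in Python. (The 48 kB/s figure is the article's claim; the audio bitrates are standard CD and MP3 parameters.)

```python
# Sanity check: can a 48 kB/s body-conduction channel carry music?

channel_kbps = 48 * 8  # 48 kilobytes/s -> 384 kilobits/s

# Uncompressed CD-quality stereo PCM: 44.1 kHz samples, 16 bits, 2 channels
cd_kbps = 44_100 * 16 * 2 / 1000  # = 1411.2 kbps

mp3_kbps = 320  # highest standard MP3 bitrate

print(f"Channel capacity: {channel_kbps} kbps")
print(f"Raw CD audio ({cd_kbps} kbps) fits? {channel_kbps >= cd_kbps}")
print(f"320 kbps MP3 fits? {channel_kbps >= mp3_kbps}")
```

So your body couldn't stream raw CD audio, but it comfortably carries even the highest-bitrate MP3s, with room to spare.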

Order now!

Monday, February 26, 2007

The Medicinal Power Of Moonlight and Pig Bladders

My mom is a firm believer in the New Age. She wrote a book on Nostradamus, reinterpreting his verses for predictive ability (HERE is a link to her book). She also wrote a pamphlet entitled "The Power of Crystals," as in quartz crystals having energy of their own that can be tapped for special supernatural uses. I can't say I always share her beliefs, but they are at least entertaining and certainly point to her open-mindedness. She's a regular reader of this blog, after all (but, then, doesn't everyone's mom read their kids' stuff, even if it's dreck like this?). I must admit a good deal of fascination with "alternative" views of reality, including UFOs, ghosts, and predictions, along with a degree of irrational belief. If even 1% of it is true, most of these things would be groundbreaking and mindwarping.

I can't scoff at my mom, though. When I was a teenager I had a nasty wart growing on a finger for more than a year. One night, when there was a full moon, she convinced me she could remove it by casting a sort of spell. Smiling, we went outside, and in the light of the moon she rubbed the wart in some special way. I can't remember if she said anything while doing this (Mom, maybe you could leave a comment and describe the procedure!). Laughing, I went back to whatever I was doing. But within a couple weeks the wart sloughed off and never came back. Coincidence? The scientist in me says yes, but I admit a certain degree of bewilderment.

A recent news report describes something less magical, but just as interesting. Think, for a minute, of shaman remedies where a wound is treated with extract from some forest plant, and you'll be on the right path.

Animals such as salamanders are able to re-grow limbs and tails that have been cut off. Even human fetuses have a similar regenerative ability. Science is still figuring out how. Then one day a researcher was trying to replace a dog's aorta with a piece of intestine. The wound healed amazingly fast and the intestine re-formed into a sort of aorta. After years of study, it became apparent that extracellular matrix from the intestine had guided regeneration of the tissue. Extracellular matrix (ECM) of this sort is found in a number of other organs, such as bladder. Because ECM is cell-free, it isn't rejected by the body's immune responses.

As a result of this research, ECM patches have been used for a number of years to help repair torn rotator cuffs and hernias, and in veterinary medicine. Now there is a report describing how ECM, applied as an extract of pig bladder, is able to grow back the missing tips of fingers:

here: http://news.yahoo.com/s/ap/20070219/ap_on_sc/regrowing_fingers_4

A company called ACell, formed by the researchers who first discovered the technique, is testing the product for many such uses. A brother of one of ACell's founders accidentally cut off the tip of his finger in 2005 by sticking it into a moving hobby plane propeller. In his words, "I pointed to it and said, 'You need to get rid of this engine, it's too dangerous.' And I put my finger through the prop." Oops! Talk about dumbass mistakes.

Doctors told him his fingertip was lost forever, but he went to his brother at ACell and received a paste of ECM from pig bladder, which he applied to his fingertip every two days for four weeks. Now the fingertip has grown back. In fact, the nail grows at twice the rate of those on his other fingers, and the skin on the tip doesn't crack in cold weather like the skin on his other fingers does!

Now the military is testing this technique on missing fingers of soldiers from Iraq. If they can manage to grow back even a slight stump or digit to pinch with, their lives will be far better than the alternative: http://webreprints.djreprints.com/1646641311092.html.

It ain't magic, but certainly it is the application of something we still don't fully understand. In the words of Arthur C. Clarke, "Any sufficiently advanced technology is indistinguishable from magic," but I would add that those technologies need not be "advanced" in the sense of novel physics or computer power. Sometimes the most advanced technologies are reapplications of what nature has already given us, or what our forebears already figured out and modern society forgot.

So is there a scientific explanation for my Mom's wart-removal technique, or is it just hocus-pocus? Will science one day re-embrace the old "eye of newt and bladder of pig" philosophy of Medieval medicine? In any case, Mom, keep letting me in on your alternative views. Science, shaman magic, and New Age beliefs all have one thing in common: they constantly revisit and re-adjust our concept of reality in search of the elusive truth.

By the way, if you've got any warts, the next full moon is March 3rd ….

Friday, February 23, 2007

Critical Thinking And LEGO Robots

It's amazing what can happen if people think.

Yesterday I had the pleasure of watching some local school kids demonstrate their LEGO robots. They were part of a team from an area middle school. These bright, culturally diverse 6th-, 7th-, and 8th-grade boys (and a girl) competed at the state level and won awards for their ability to build a robot of their own design to perform specific tasks that the competition demanded (such as traveling a short distance to trigger devices or drop an object into a specific location), all out of parts from LEGO robotics kits. Though the obstacles the robot had to overcome were pre-defined, the design and programming were the students' own, with minimal assistance from the team coach and the high-school kids who acted as mentors. Maybe they could have learned something by being told exactly how to build and program the robot from a specific blueprint, but I guarantee they learned more about robotics and, more generally, critical thinking by coming up with their own design and testing it.

Beats the hell out of the "spaceships" I built with LEGOs at their age!

Education is a marvelous thing, as it gives you a toolbox of knowledge from which to consider the world (like a box of LEGO blocks), but that knowledge is useless if you can't learn to apply it creatively (like making a friggin' robot out of them). I think back to my organic chemistry classes as an undergrad, for instance, where I was forced to memorize very complex chemical reactions and structures, only to regurgitate them on the next test before cramming my brain with more information. Within a year or two of taking the class I doubt I could have remembered even 10% of what I learned. A decade later I might have recalled only a few bits and pieces. As a scientist, I have been taught to think critically, not just about science questions, but about all things. It's in my nature to question things (even if it makes me look cynical in the process), but as my Organic Chem class illustrates, even science has obstacles to overcome.

Recently, a professor at Ohio State University tested the role of critical thinking on his introductory-level biology students:


The 300 or so students were taking a lab class on the role of enzymes in biology, and fell into two groups. The first group of students was given prepared enzymes and step-by-step instructions on how to test them. The second group was given a raw turnip from which they had to extract the enzymes themselves and then come up with a plan of their own to test them, exercising their critical and creative thinking. In the end, they were asked a simple question, "Where do enzymes occur in nature?" The correct (and simplistic) answer: "In living tissue." Only 23% of the "step-by-step" group got it correct, compared to 83% of the "critical thinking" group.

Said the professor, "The students in the first group were just as intelligent as those in the second group. They just lacked confidence. No teacher had ever asked them something as simple as how do they want to display what they saw in the experiment. They had always been told how to do that. Educators thought they were doing students a wonderful favor by giving them step-by-step instructions."

And that was at the college level! How often do you think the average grade school student in America is asked to think critically about the information they are taught? How much is rote memorization? In this day of standardized testing, I'm doubtful critical thinking raises its shy head, even in science classes.

And we wonder why people adhere to horoscopes, latch onto the latest fad diets, or believe anything presented on the evening news as undeniable fact. Come on, folks! Critical thinking and creativity drive innovation and reveal the truth behind the veils of ignorance. Think back to your favorite class in school. I'd bet my left thumb that it was one where you got to be creative and didn't have to cram your brain with memorized details, yet learned a lot.

When I saw those kids and their robots, hope sprang eternal. Will they now apply the abstract lesson they learned and raise their hands more in class, questioning why the teacher said what she did?

The 8th-grader in me is dying to get one of those LEGO robot kits.

Thursday, February 22, 2007

African-American Scientists: Daniel Hale Williams

For the last of my Black History Month tributes to African-American Scientists, I spotlight Dr. Daniel Hale Williams, a pioneer in turn-of-the-century surgery and sterile procedure, founder of early African-American and integrated hospitals, and instructor of medicine.

A good bio from which much of this was taken:


Williams was born to a mixed-race family in Hollidaysburg, Pennsylvania, on January 18, 1856, the fifth of seven children. His father, also named Daniel Williams, was a white man and an active abolitionist. Daniel's mother was a free Black woman, Sarah Price Williams. Daniel's father, a barber, moved the family to Annapolis, Maryland, but died of tuberculosis shortly thereafter, when Daniel was 11.

Although some members of the family lived as whites, and he could also have done so, Daniel refused to "pass" and actively identified himself as Black. Soon after his father died his mother sent her children to live with different relatives, except Daniel, who was apprenticed to a shoemaker in Baltimore, while she went to live in Illinois. After a while Daniel left his apprenticeship and followed her, but although the reunion was happy, his mother soon moved to Maryland with his sisters to rejoin the other children, and Daniel elected to stay in Illinois.

For the next several years he worked and lived with various cousins, but when he was 16 he struck out on his own and moved to Wisconsin, where he became a barber, living very happily with his employer's family, and also attended high school. Initially, Williams was apprenticed to Dr. Henry Palmer, a well-known Union surgeon from the Civil War. His employer-cum-foster-father later financed his medical training at Northwestern University Medical School (known at the time as the Chicago Medical College), from which Williams graduated in 1883.

Because of the primitive social and medical circumstances of that era, much of Williams' early medical practice called for him to treat patients in their homes, including conducting occasional surgeries on kitchen tables. In doing so, Williams utilized many of the emerging antiseptic and sterilization procedures of the day and thereby gained a reputation for professionalism. He was soon appointed as a surgeon on the staff of the South Side Dispensary and then a clinical instructor in anatomy at Northwestern. In 1889 he was appointed to the Illinois State Board of Health, and one year later he set out to create an interracial hospital.

On January 23, 1891, Daniel Hale Williams established the Provident Hospital and Training School Association, a three-story building that held 12 beds and served members of the community as a whole. The school also served to train Black nurses and utilized doctors of all races. The hospital's success rate was phenomenal considering the financial and health conditions of the patients and the primitive conditions of most hospitals. Much can be attributed to Williams' insistence on the highest standards for procedures and sanitary conditions.

Williams is perhaps best known for a surgery he performed at Provident Hospital in 1893. Internal surgery was almost unheard of at the time due to the high risk of infection. When a man came in who had been stabbed in the chest, Williams took the initiative to open the chest and perform surgery, suturing a cut through the pericardium (the sac around the heart), then applying antiseptic procedures before closing. Cured, the patient walked out of the hospital 51 days later and lived another fifty years. Technically this wasn't open-heart surgery, and similar operations had been performed in Europe on at least a couple of occasions over the prior hundred years, yet Williams is often credited with performing "the first open-heart surgery."

In February 1894, Daniel Hale Williams was appointed Chief Surgeon at the Freedmen's Hospital in Washington, D.C., where he reorganized the hospital, creating seven medical and surgical departments, setting up pathological and bacteriological units, establishing a biracial staff of highly qualified doctors and nurses, and establishing an internship program. Recognition of his efforts and their success came when doctors from all over the country traveled to Washington to view the hospital and to sit in on surgery performed there. Almost immediately there was an astounding increase in efficiency as well as a decrease in patient deaths.

During this time, Williams married Alice Johnson and the couple soon moved to Chicago after Daniel resigned from the Freedmen's hospital. He resumed his position as Chief Surgeon at Provident Hospital (which could now accommodate 65 patients) as well as for nearby Mercy Hospital and St. Luke's Hospital, an exclusive hospital for wealthy White patients. He was also asked to travel across the country to attend to important patients or to oversee certain procedures.

When the American Medical Association refused to accept Black members, Williams helped to set up and served as Vice-President of the National Medical Association. In 1912, Williams was appointed associate attending surgeon at St. Luke's and worked there until his retirement from the practice of medicine.

Upon his retirement, Daniel Hale Williams had bestowed upon him numerous honors and awards. He received honorary degrees from Howard and Wilberforce Universities, was named a charter member of the American College of Surgeons and was a member of the Chicago Surgical Society.

Williams died from a stroke on August 4, 1931, in Idlewild, Michigan, having set standards and examples for surgeons, both Black and White, for years to come.

Tuesday, February 20, 2007

Plugging The Volcano

Oh man! Do you remember my post about the mud volcano in East Java, where a natural gas-exploration company used faulty practices in their drilling and wound up creating a mud-spewing volcano that has since displaced 13,000 people and buried four villages? Well, make that 15,000 people, and the mud is still spreading. It has now threatened a major railway.

Even though leading geologists have published a report showing the catastrophe was caused by faulty drilling techniques, and Indonesian President Susilo Bambang Yudhoyono has ordered the company to pay restitution, the drilling company, PT Lapindo Brantas, and Indonesian welfare minister Aburizal Bakrie, whose family owns the company, still claim the company is innocent and that the catastrophe is due to natural causes. Efforts to divert the mud flow to a local river have failed.

Here is a YouTube video of the volcano and the attempts to build diversion levees:


Geologists suggest further attempts to divert the flow, to the sea, but the Indonesian government has other plans: Plug the hole!


That's right. Their plan is to lower 2,000 high-density concrete balls into the hole of the volcano, thinking that the balls will slow the flow by 50–70%. A local geologist says it is doomed to failure and that the balls will likely be pushed back out.

Seems sorta like the story of the Dutch boy putting his finger in the dike, eh?

Saturday, February 17, 2007

It May Be Safer To Lick The Office Toilet Seat Than Your Messy Desk

Let's face it, people are slobs. Even at work, most of us aren't the picture of organizational and hygienic excellence. Take MY office, for instance. I've got stacks of lab books, experimental notes, research journals, and assorted paperwork piled on either side of my computer. Occasionally I have to shove it all out of the way just to make room for my mousepad. There are a few areas, such as the space behind my monitor (which sits on the joint of my L-shaped desk), where dust settles and hasn't been cleaned since the government had a budget surplus. I usually eat at my desk, so there are sometimes crumbs lying around. I usually have one or two empty cans of Pepsi sitting on the desk, and I have my share of snacks in my desk drawer, including some tea bags, a box of ramen noodles (for an emergency lunch option), some packaged fruit leather strips, and Altoids ("curiously strong" for my curiously strong bad breath). It's been at least a couple months since I wiped down my keyboard and mouse with an alcohol wipe, and that was just because I had been sick and had to share my computer with someone one afternoon.

Despite my organization and hygiene, I'd still rank myself as about average (well, okay, maybe a little worse than average). Most of us keep snacks in our desks and have at least one good stack of unfiled paperwork. Being an office-eater does take some skill, though I prefer eating out when possible (as I've remarked before: HERE).

Recently a study came out that found the average office desk has a higher bacterial count than the average office toilet:

That's right. Swabs of office equipment and belongings have more bacteria than the porcelain throne. Personally, I find it a bit alarming that my desktop has 400 times more bacteria than the spot where I and my colleagues plant our naked, pimply asses.

Interestingly, though women's desks were more organized, they were three to four times more bacteria-laden. The authors believe this is partly because women were more likely than men to keep snacks in their desks (75% of women did), had cosmetics and hand lotions that can harbor bacteria, and were more likely to interact with young children (who, as I can assure you from personal experience, are little illness incubators). But before us men become too cocky about this result, we should note that the study found men's wallets to be the single worst item for bacterial concentration.

So far I haven't learned what species of bacteria were found. I'd say there's a pretty good chance most of them are benign. Remember, not all bacteria are "bad" bacteria.

The authors went on to report that desks that are regularly disinfected have 25% fewer bacteria. They suggest disinfecting once a day. Not likely, given my hectic schedule, but at least once a month would be a step up for me.

So the next time I head to my second office (the one with the flushable office chair and tiled floor), I'll remember this report. Maybe it will spur me to clean my office more often.

Or maybe I'll just eat my ramen noodles in the bathroom stall.

Thursday, February 15, 2007

African-American Scientists: Shirley Ann Jackson

Continuing my celebration of Black History Month, this week's featured African-American scientist is Shirley Ann Jackson, a theoretical physicist, world expert in nuclear regulation, and current president of Rensselaer Polytechnic Institute.

A good profile: http://www.rpi.edu/president/profile.html

Dr. Jackson is the first African-American woman to receive a doctorate from M.I.T. — in any subject. She is one of the first two African-American women to receive a doctorate in physics in the U.S. She is the first African-American to become a Commissioner of the U.S. Nuclear Regulatory Commission. She is both the first woman and the first African-American to serve as the chairman of the U.S. Nuclear Regulatory Commission, and now the first African-American woman to lead a national research university. She also is the first African-American woman elected to the National Academy of Engineering.

Shirley Jackson was born in Washington, D.C., in 1946. Strongly supported by her parents, she excelled in school, attending accelerated classes in math and science and graduating as valedictorian in 1964. She immediately entered M.I.T., studying theoretical physics while volunteering at the Boston City Hospital and the YMCA. Four years later she graduated with her bachelor's degree, writing her thesis on solid-state physics (which was at the forefront of theoretical physics at the time). Although accepted at Brown, Harvard, and the University of Chicago, Jackson decided to stay at M.I.T. for her doctoral work because she wanted to encourage more African-American students to attend the institution. She earned her Ph.D. in elementary particle theory in 1973.

In the '70s, Jackson focused on high-energy particle physics, including work at the Fermi National Accelerator Laboratory. In the '80s and early '90s she worked on a wide array of physics including energy superlattices, superconductors, neutrino research, quantum physics, and opto-electronic materials, preparing or collaborating on over 100 scientific articles.

From 1991 to 1995, Dr. Jackson was a professor of physics at Rutgers University, where she taught undergraduate and graduate students, conducted research on the electronic and optical properties of two-dimensional systems, and supervised Ph.D. candidates. She concurrently served as a consultant in semiconductor theory to AT&T Bell Laboratories.

By the mid-'90s Jackson increasingly became affiliated with politics and nuclear policy. In 1995 President Bill Clinton appointed Dr. Jackson to serve as Chairperson of the U.S. Nuclear Regulatory Commission (NRC), continuing until 1999. As Chairperson, she was the principal executive officer of and the official spokesperson for the NRC. While in this role, Jackson worked with a number of world organizations and served as a liaison between our nation and others for nuclear issues, including the International Atomic Energy Agency. Jackson served 10 years as a member of the New Jersey Commission on Science and Technology, appointed by the governor.

Jackson holds an amazing 40 honorary doctoral degrees, including one from Harvard University, and has received more awards than I could reasonably list here. She was inducted into the National Women's Hall of Fame in 1998 for her significant and profound contributions as a distinguished scientist and advocate for education, science, and public policy. She was inducted into the Women in Technology International (WITI) Foundation Hall of Fame in June 2000; WITI recognizes women technologists and scientists whose achievements are exceptional.

Since 1999, Shirley Jackson has served as the 18th president of Rensselaer Polytechnic Institute, in Troy, New York. Dr. Jackson is married to Dr. Morris A. Washington, also a physicist. They have one son, Alan, a graduate of Dartmouth College.

Wednesday, February 14, 2007

Romeo & Juliet, Neolithic-Style

Happy Valentine's Day!

Love is grand. It is the time of year for us to dwell on those significant others in our life and to shell out large sums of money on flowers, chocolate, and expensive shiny things. My kids made some great little cut-outs of hearts with stickers on them for us at daycare yesterday, but I don't think my lovely wife would care for such a gift from me, so I'm getting something a little more expensive to express my undying love for her. Of course, my intentions are partly selfish. I'm expecting chocolate. But I don't expect it to stop there – hubba hubba.

I invite you to read about the history of Valentine's Day, starting with Saint Valentine and how, prior to being put to death by the Roman emperor Claudius, he sent the first Valentine's letter to his beloved, the daughter of his jailor!


Valentine's Day has been celebrated since the late 1300s:


Well, as you are wondering about the best way to express your love, roaming through the mall to find the right little gift, or perusing the Hallmark aisle, think about a more eternal sort of love expression: dying together and being buried in a loving pose, arms intertwined, staring into each other's eyes, so you can be found up to 6,000 years later in such a condition:


Another article: http://www.physorg.com/news90589132.html

Yes, my little lovebirds, how could you possibly top that? It isn't known yet how this young couple died. Disease? Catastrophe? Apparently it wasn't uncommon in that region in Neolithic times for a wife to be put to death and buried with her husband if he died. Lovely! But there is no evidence that that is what happened in this case. For today, at least, we will assume a more pleasant ending, eh? DNA testing will determine if they were related, and further analysis of the bones may reveal the cause of death. Buried along with them were some arrowheads and a small knife.

Where else would you expect such a romantic burial but Italy? Along with France, Italy is often considered one of the most romantic of nations. The dead couple were found near Mantua, Italy, which just happens to be only 25 miles south of Verona, the setting for Shakespeare's "Romeo & Juliet"!

The archaeologists who are excavating the site will remove the pair as one large, undisturbed block, preserving them in this unique burial position so that they may remain models of undying love.


Tuesday, February 13, 2007

My Company's Damned Annual Review Process

WARNING: The following diatribe may cause serious damage to the frontal cortex, morale, and general will to live for corporate employees. Do not continue if you have a history of heart conditions, strokes, hypertension, depression, irritable bowel, peptic ulcers, distemper, athlete's foot, or hangnails. If during the reading of this post you experience heart fluctuations, spastic jerking of facial muscles, a feeling of anger and/or suicidal thoughts, aneurysms, or erectile dysfunction, stop reading immediately and seek medical help. Rare but potential long-term conditions may include Tourette's Syndrome and mild dementia.

I'm smack in the middle of my company's f*cking annual review period. By "annual review period" I mean that SIX MONTH period of time when all of the big talking heads in the company decide the fate of my career over the next year, during which I have absolutely no voice other than what my immediate boss has to say about me. Hopefully I've impressed enough people with my amazing powers of innovation, or kissed enough ass, that they have a favorable impression of me and will recommend me for promotion. Unfortunately, I'm not into kissing buttocks and stroking egos, so that leaves me only the "amazing powers of innovation" part.

That's right. Six months. The process began in November with me filling out a five-page online form in which I had to describe how great I am, explain how I've met my goals from last year, write little essays about my strengths and weaknesses, and rate my own performance. The process will end in April when my supervisor hands me a piece of paper that says whether or not I receive a promotion and/or pay raise (gollum!).

Why does it take so long? Good question. But then it's as inefficient as a lot of the decision-making going on here.

In the months in-between, that form that I filled out was passed to my supervisor, who added his own comments, changed the ratings as he saw fit, and decided for himself if I met the goals. Of course the goals are meaningless, since they were written down over a year ago and the company and its programs have taken a 180-degree turn since then, as they do every 6-12 months. And my current supervisor isn't the one I had then. But that didn't stop us from writing new goals and pretending we wrote them together last year.

Unfortunately, that long form we worked on and the goals I wrote down and debated with my supervisor are meaningless for the decision to promote me; only my supervisor and I are likely to ever read them. So why spend days on them at all, I ask? Every year it's the same circle jerk. Basically, the form serves no purpose other than to be a mechanism for us to sit down and for him to toss me comments both good and "constructive," but if he and I have a good working relationship, like we are supposed to, why bother? There's nothing he said in that meeting that he and I haven't already said before. But bosses have to be "constructive" about something in such meetings, and since I am a good employee, I got critiqued not on results or projects, but on how some third party thought I meant one thing when I really meant another, how I could have worded an email to be more politically correct, and how I could work on smiling more. Meanwhile, the hour I spent listening to this could have been spent doing the experiments that I have to do. Now I'll have to work late. I'll try to remember to smile more as I come in late tonight to finish my work.

In the coming month I am supposed to meet with my supervisor again to decide my goals for the next year. These goals are supposed to be in line with the corporate goals, which were handed down to us in spreadsheet form last Friday, along with an hour-long pep talk. The goal spreadsheet is eight pages long. Reading through the corporate goals is an interesting exercise in interpreting "corporate speak," populated with curious acronyms and abbreviations, inspirational catch phrases, and business numbers in hundreds of millions of dollars by quarter. Nice. You can tell it was written by folks at the very top who have little understanding of what rational goals mean for lab rats working at the benches. To the upper management: Just tell me what f*cking projects you want me to do and I'll make the company another million dollars. Other than that, don't bother me.

Now, let me say that both my supervisor and his boss, the R&D director, are good, sincere people. I honestly believe they are trying to help both me and my company succeed. But what we are working with here is an annual review process which is as efficient as paddle-boating in a hurricane, and just as meaningless. It needs to change in a big way. And only the big talking heads are in a position to change anything. I suggested to my boss that he please pass on to his bosses some suggestions from me, and I told him he could feel free to mention my name and that I'd be happy to talk with anyone about it. (Yes, I know what you're thinking. Why the hell can't I just fly under the radar like everyone else and bear the pain!).

What were my suggestions for him to pass on? 1) Let's pare down the process to, say, three months, 2) Let's either lose the meaningless form or have it actually count for something, and 3) Let's devise goals that actually mean something and are flexible enough to account for the constant rate of change at my company. Maybe instead of formulating them for a year, we can review them every six months.

My boss just smiled. Somehow I don't think my suggestions are going anywhere.

To my tens of readers: I'm almost afraid to ask, but I have a strange sado-masochistic twinge: What is your company's annual review process like? Is my evil global biotech company alone in this dysfunction, or is this drooling behemoth the industry standard?

Monday, February 12, 2007

Sleeping At Work

Being the parent of two children in diapers, plus working full time as a lab rat, is a hell of a tiring lifestyle. In the typical day I get home around 6:00PM or 6:30PM and immediately launch into "daddy mode", which doesn't stop until the kids go to sleep around 9:00PM. By then all I want to do is crash in front of the TV, but there are always household chores to do. And I am the busy type with lots of hobbies, so in order to get those things done, too, I wind up staying up until about 1AM. Add to this the fact that the kids still tend to wake up once or twice a night, and if they are sick or teething they wake up a lot more. Then of course I have to wake up in time to get to work (which, luckily, doesn't require me there until 9AM), and every other morning it's my turn to get up with the kids, sometime between 6:30AM and 7:30AM. So I only get about 5 or 6 hours of interrupted sleep on a good night. I'm only half-complaining, since this is a sleep schedule I bring upon myself (what with the hobbies and all and my choice to be a parent, knowing what I was getting into). At least it's not as bad as when the kids were newborns.

So sleep is at a premium, and many days I drag myself into work in a haze and daydream all day about falling into my bed. My wife is home with the kids many days, and has the option sometimes of napping when the kids nap, in the early afternoon. I envy her a bit for that (though being at home with the kids is usually more physically tiring than my work).

So I revel in stories about people sleeping at work. I can think of a couple people at my work who catch a cat nap in their car over lunch. The problem with that plan is that the car may be screaming hot in the summer or frigid in the winter. I've also read about someone who would sleep in the restroom while sitting on the toilet. No one is going to bother you there, but of course this has its obvious downsides! I haven't quite heard of anyone going to the extreme of George Costanza in "Seinfeld", who in one episode slept under his desk, complete with little shelves, a pillow, and an alarm clock! There is actually a book on the subject, entitled The Art of Napping at Work. Personally, I've only napped a couple of times at work, when I was sick, by putting my head on my desk and closing my door (I'm lucky I have an office to do this in!).

There are a couple companies out there that actually encourage employees to take naps in special nap rooms, believing (oddly enough) that a well-rested employee is more productive. What a notion! It's expected in many countries around the world, like around the Mediterranean, but here in the U.S. it is considered a sign of laziness.

Well, now a study shows that working men, and possibly women too, who nap during the day are not only more rested, but are less likely to die an early death from heart attack:


The Greek researchers followed 23,681 healthy Greek adults for six years (siestas are considered normal in Greece). Those who napped at least 3 times a week for about a half hour – get this! – were a whopping 37% less likely to die from heart attack! Wow. Given I have a mild heart condition and that serious heart and circulatory problems run in my family, maybe I should start taking daily siestas, eh?
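As an aside for the stat-minded: that 37% is a relative risk reduction, not an absolute one. Here's a back-of-envelope sketch with death counts I invented purely to illustrate the arithmetic (the study itself reported adjusted hazard ratios, a related but fancier measure):

```python
def relative_risk_reduction(deaths_nappers, n_nappers, deaths_others, n_others):
    """1 - (risk among nappers / risk among non-nappers)."""
    risk_nap = deaths_nappers / n_nappers
    risk_other = deaths_others / n_others
    return 1 - risk_nap / risk_other

# Hypothetical counts chosen so the arithmetic lands on the reported figure:
# 63 coronary deaths per 10,000 nappers vs. 100 per 10,000 non-nappers.
rrr = relative_risk_reduction(63, 10_000, 100, 10_000)
print(f"relative risk reduction: {rrr:.0%}")  # prints "relative risk reduction: 37%"
```

Notice that the absolute difference in those made-up numbers is only 0.37 percentage points (0.63% vs. 1.00% over the study period); relative figures always sound more dramatic.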

One possible problem with the study, though, is that highly-stressed individuals (who are therefore more prone to heart attack) may not make time to nap, and less-stressed individuals do, thus skewing the data.

I wonder what my boss would think about me taking a daily nap?

And could I get a note from my doctor mandating that I put a cot in my office and sleep on the job?

Sunday, February 11, 2007

Reading My Intentions

Quick! What am I thinking?

You can't read my mind? So far no one can truly do that, but science is getting closer all the time.

Think back to the last action movie you watched. There's a pretty good chance that the climactic scene involved the bad guy putting a gun to some helpless blonde female's head or a finger on a red-pulsing doomsday button. You and the hero of the movie are left wondering, "Is he really going to do it?"

A researcher named John-Dylan Haynes from the Max Planck Institute for Human Cognitive and Brain Sciences now has a way to find out, with a success rate of 70%. This was made possible by a new combination of functional magnetic resonance imaging and sophisticated computer algorithms (Current Biology, 20th February 2007, online: 8th February). Of course, you'd have to put the bad guy's head in an MRI machine….

Story: http://www.physorg.com/news90164161.html

In short, test subjects were imaged with the MRI technique while being told they would be shown two numbers; they had to decide beforehand whether to add the numbers together or subtract one from the other, and only then were the numbers shown to them. In that moment of decision, certain areas of the brain showed distinct patterns of activity, making it possible to predict what the test subjects' intentions were. The authors point out the potential uses of this technology, such as helping guide artificial limbs or computer cursors for the handicapped.
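The paper's 70% came from sophisticated pattern classifiers run on real fMRI voxel data; I haven't seen their code, but the flavor of the approach can be sketched with a toy nearest-centroid "decoder" on simulated voxel patterns. Everything below (the 10-voxel patterns, the mean shifts, the labels) is invented for illustration; it is not the authors' actual method.

```python
import random

random.seed(0)

def make_trials(mean, n_trials=20, n_voxels=10):
    """Simulated 'voxel' activity: Gaussian noise around a class-specific mean."""
    return [[random.gauss(mean, 1.0) for _ in range(n_voxels)]
            for _ in range(n_trials)]

def centroid(trials):
    """Average pattern across trials, voxel by voxel."""
    return [sum(t[i] for t in trials) / len(trials)
            for i in range(len(trials[0]))]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# "Train" on labeled trials: patterns recorded while the subject intended
# to add (+0.5 mean shift) or to subtract (-0.5 mean shift).
add_centroid = centroid(make_trials(+0.5))
sub_centroid = centroid(make_trials(-0.5))

def decode_intention(pattern):
    """Guess the covert intention by proximity to each class centroid."""
    if distance(pattern, add_centroid) < distance(pattern, sub_centroid):
        return "add"
    return "subtract"

# Score the decoder on fresh simulated trials from each class.
hits = sum(decode_intention(p) == "add" for p in make_trials(+0.5, 50))
hits += sum(decode_intention(p) == "subtract" for p in make_trials(-0.5, 50))
print(f"decoded {hits}/100 simulated intentions correctly")
```

With class means this well separated, the toy decoder scores far better than 70%; real voxel data is vastly noisier, which is exactly what makes the published accuracy impressive.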

Now if only this could be used in more mundane situations. For instance, tonight I was eating Haagen-Dazs ice cream while watching a movie ("Lord of the Rings – The Two Towers"), and my wife might have been wondering, "Is he going to hog the entire container again?" Important decisions were on the line, such as leaving me in chocolaty ice cream bliss to enjoy seeing orcs hacked to pieces, or to yell, "Hey, lard-butt, stop bogarting the ice cream!" As it was, being the wonderful caring person she is, she left me in bliss, no doubt trusting me to leave some for her.

I ate it all. Call me a glutton. Mmmm. Mayan chocolate flavor!

What? I had every _intention_ of sharing….

Thursday, February 8, 2007

African-American Scientists: Guion S. Bluford, Jr.

Given the recent news of out-of-control astronauts, let's take a moment to revisit the heroic and accomplished record of the others. For instance, today, astronaut Michael Lopez-Alegria broke the U.S. record for most time performing space walks as he performed some maintenance work on the International Space Station (CNN Story).

In continuing honor of Black History Month, this Thursday's tribute is for another astronaut of great accomplishment: Guion S. Bluford, Jr., known as "Guy". Bluford, who is now age 65, is an aerospace engineer, a retired colonel in the U.S. Air Force, retired astronaut with NASA's space shuttle program, and the first African-American to go into space.

His Wikipedia biography (which has a link to his NASA bio): http://en.wikipedia.org/wiki/Guion_S._Bluford

Guy was born in Philadelphia, PA, in 1942. I wonder about his first name, Guion. It leaves me scratching my head about the exact pronunciation, but I have to say it rates pretty high on my name-ranking system. Given his nickname is "Guy" I would conclude it is pronounced "guy-on." I couldn't find any information about his youth, except that he was an Eagle Scout with the Boy Scouts of America. Bluford received his BS in aerospace engineering from Penn State in 1964, then attended pilot training, earning his wings in 1966. He was promptly shipped off to Vietnam where he flew 144 missions, 65 of which were over North Vietnam, in less than two years. He soon returned to the U.S. where he became a flight trainer and executive.

Always busy achieving, Bluford found time to earn an MS in aerospace engineering from the Air Force Institute of Technology in 1974, then his PhD in aerospace engineering with a minor in laser physics from the same institution in 1978. Apparently his scientific research in his pre-astronaut years revolved around computational fluid dynamics: air flow around wing designs for planes and missiles, and missile thrust vectoring. He has published at least three papers on these topics, but most of his work was for the military and not shared with the academic world.

Within a year of earning his Ph.D., Bluford became an astronaut with NASA (though he wasn't the first African-American accepted as an astronaut, that honor goes to Maj. Robert Lawrence, Jr., who died in a plane crash prior to going into space). Bluford became the first African-American in space onboard the Challenger in 1983 (the first mission to launch and land at night). He served on three additional space shuttle flights between then and 1992. He was a specialist operating the robotic arm (remote manipulator system), worked with avionics systems, was a key figure on Spacelab experiments, and dealt with payload safety issues. He has logged 688 hours in space. He was inducted into the International Space Hall of Fame in 1997.

Of particular note is Bluford's association with the Challenger Flag, a U.S. Capitol flag that was given to a Boy Scout troop, then flown on the last Challenger mission. After the flag was recovered (undamaged!) from the remains of the Challenger, Bluford (being an Eagle Scout and astronaut) was the emissary who returned the Challenger flag to Boy Scout Troop 514 of Monument, Colorado in December 1986. On December 18 of that year, he presented the flag to the troop in a special ceremony at Falcon Air Force Base. The flag has since been honored at a number of ceremonies, including the Winter Olympics at Salt Lake City.

Bluford left NASA in 1993 to take a position as Vice President/General Manager of the Engineering Services Division of a company called NYMA inc., in Brook Park, Ohio. I couldn't find a home page for the company, but they do some sort of engineering contracts for the Department of Defense.

On his NASA page, Bluford's hobbies are listed as reading, swimming, jogging, racquetball, handball, and scuba, but in his own words, during his astronaut years: "The job is so fantastic, you don’t need a hobby. The hobby is going to work."

Wednesday, February 7, 2007

Houston, We Have A Problem

Love is grand. In this month of Valentine's Day, we are reminded of the joy and wonder of falling inextricably and overwhelmingly in love and pursuing your lover until you can lock eyes and arms and other flailing body parts in hot passion.

Oh, and let's not forget love triangles. Yes, young cupids, perchance to dream of unrequited love as your obsession goes unrewarded and the beau or belle of your dreams loves another. And when you can't win the heart of your dream man (or woman), well, all you can do is strap on a diaper, drive 900 miles, and try to murder the competition:


It's been all over the news: A hunky shuttle commander, Navy Cmdr. Bill Oefelein, was having an affair with one of his support crew, Air Force Capt. Colleen Shipman. But fellow shuttle astronaut Navy Capt. Lisa Nowak had a crush on Oefelein. When Nowak failed to win Oefelein's heart, she decided Shipman had to go. So she stalked Shipman for two months. Then, a couple days ago, she strapped on a diaper (so she wouldn't have to stop for bathroom breaks!) and drove 900 miles from Houston to the Orlando airport, where Shipman was getting off a plane. When Shipman made it to her car in the parking lot, Nowak assaulted her with pepper spray, but because Nowak waited until Shipman was actually inside the car, Shipman got away and alerted the police. Nowak was caught with a four-inch knife, a metal mallet, tubing of some sort, and $600 in cash.

Now, please keep in mind that Nowak was considered a shining star in the astronaut corps, where only the best and brightest military scientists are even considered. If she's so intelligent, why the hell would she botch the attack so badly? If she had a knife and a friggin' metal mallet, wouldn't you think they would be better weapons? Thank goodness criminals are stupid.

Can you imagine if this had happened while they were in space? NASA put out a statement today that Nowak is grounded for 30 days. No sh*t. Aw, come on, let 'er fly. 30 days is TOO LONG! What could be the harm? "In space, no one can hear you scream!"

Already people are making apologetic statements on Nowak's behalf, probably because, as an astronaut, she's supposed to be a "hero". The most common I've read is that she was unstable due to the stresses of her occupation and recent shuttle missions. Some say it is a temporary insanity sort of thing. Some people are even suggesting there is a physical ailment, like a brain tumor. But let's face it, astronauts are people too, and just as given to the usual darker side of being human that plagues the rest of society. No amount of psych testing can catch everything.

You know, I'd bet a goodly sum of cash that someone out there is already pitching a made-for-TV movie on this. Can't you just see it? A love triangle with an insane astronaut. Stalking. Murder attempts with pepper spray and metal mallets. I can see the pitch statement: "When astronauts will murder for love, they'll pee their pants to shoot for the star in their heart!"

Just makes you feel warm and fuzzy, don't it?

Tuesday, February 6, 2007

Suicide Genes And My Conspiratorial Mind

There are some DNA sequences so dangerous that they don't exist. Call them "suicide genes":


No, I'm not talking about some genetic propensity that, if you have these genes, will make you want to take a swan dive off the Grand Canyon. I'm talking about sequences so lethal that no organism carrying them can survive. Most likely their protein products would gum up some essential piece of cellular machinery. A researcher at Boise State University has created a computer program to enumerate all possible nucleotide sequences of a given length and check which ones appear in the human genome. So far, he has found 86 sequences of 11 nucleotides in length which have never been reported. The thought is that random mutation over the eons may have created these sequences, but because they killed the developing embryo before birth, the mutations didn't "stick" in our species. Of course, the absence of these genes from the modern human genome may not mean the developing embryos died before birth; it may just mean the individuals didn't live to reproduce. How do we know, I ask? Maybe the "suicide genes" just make someone so butt ugly no one would mate with them! I could think of one or two individuals these researchers forgot to screen.
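If you're curious, the brute-force version of that search fits in a few lines: there are only 4^11 = 4,194,304 possible 11-mers, so you can enumerate every one and subtract those found by sliding a window along the genome. This is my own toy sketch (I haven't seen the Boise State program), run on a made-up 10-base "genome" with k=2 so the output stays small:

```python
from itertools import product

def absent_kmers(genome, k):
    """Return the set of length-k DNA strings that never occur in `genome`."""
    all_kmers = {"".join(p) for p in product("ACGT", repeat=k)}
    seen = {genome[i:i + k] for i in range(len(genome) - k + 1)}
    return all_kmers - seen

# Toy demonstration; the real computation uses k=11 against the
# ~3-billion-base human genome.
toy_genome = "ACGTACGTAA"
missing = absent_kmers(toy_genome, 2)
print(len(missing))  # prints 11 (16 possible 2-mers, only 5 distinct ones seen)
```

Scaling this to k=11 on the full genome is mostly a memory-and-I/O exercise: all 4,194,304 candidate strings fit comfortably in a set, and a single pass over the genome marks the ones that actually occur.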

Now, calling them "suicide genes" is really very hypothetical. The only way to tell if they would cause a cell to die is to transfect the sequences into the DNA of living cells or organisms (humans?), which he hasn't done to my knowledge. But should we? For that matter, I ask, is it _ethical_ to explore this issue?

The Department of Defense has gotten involved by funding the research (a million bucks, no less). They say it is "to develop a DNA 'safety tag' that could be added to voluntary DNA reference samples in criminal cases to distinguish them from forensic samples." That would be nice, but wouldn't careful labeling of tubes be sufficient?

No, my conspiratorial little mind thinks they have other plans. Since when is the DoD interested in altruism? Imagine if you could somehow insert these "suicide genes" into living humans. Modern medicine is closing in on viral therapies which could do the trick. Just inhale some viruses that can transfect your cells. Better yet, attach sequences to the suicide genes that activate them only under certain conditions. Murder on command!

Do I hear a mad scientist in the wings? Mwa ha ha ha….

Sunday, February 4, 2007

Lusi In The Sky With Mud

Once again someone has done something really, really stupid to upset the Earth. No, I'm not talking about the global warming report just put out by the U.N. that showed the phenomenon is man-made, I'm talking about some morons who went drilling for natural gas and wound up making a gigantic mud volcano.

Here's a news report: http://www.physorg.com/news89653591.html

And here's the Geological Society of America journal article from February's issue: HERE

The morons in question were an Indonesian gas-drilling company called Lapindo Brantas. Back in May 2006 they were boring a hole in the ground near Surabaya, East Java, to look for gas and didn't line the sides of their hole with steel casing like they should have, which would have protected against the unexpected expulsion of pressurized water. As a result, they drilled through an area of limestone, exposing an underground aquifer, and up the water went. Only the aquifer wasn't just water, it was mud. Lapindo Brantas tried to blame the accident on an earthquake that happened days before, but the study reported in the journal of the GSA proves it was man-made.

I can imagine the conversation that led to the accident:

Bore-hole guy: "Hey, Foreman, should I put some steel casing down the bore hole? You know, like we are required to."

Lapindo Foreman: "Naw, Bambang, what's the worst that could happen? A little mud? Let's save some time and money and just dig without it. Get to work!"

What resulted was a mud volcano – a giant cone of constantly-spewing mud. This one has come to be named "Lusi". Ten square kilometers around the volcano are now uninhabitable and 13,000 people are permanently displaced. Four villages were completely erased, along with 25 factories, and 13 people died when a gas pipeline collapsed. The mud flow will likely continue for years, and there will probably be a collapse of earth around it resulting in the formation of a caldera.

How do you say "Oops" in Indonesian?

Maybe some good could come of this, eh? The power of positive thinking! Why, this could be the world's largest mud bath. They could make a spa for tourists. No? How about an SUV off-roading mud race? Yeah, that's the ticket. Those crazy Javanese and their monster trucks….

Okay, fine, it's a horrible catastrophe with no possible silver lining that resulted from someone's stupidity. Go figure.

UPDATE: http://angrylabrat.blogspot.com/2007/02/plugging-volcano.html

Friday, February 2, 2007

Sunlight, DNA Mutation, And My Face Melting Off

Man, I hate sunburns. My paternal grandmother was deathly afraid of sun damage, skin cancer in particular, so much so that she often carried an umbrella (she called it a parasol) outside with her, even just to sit in the yard for a few minutes. Interestingly, her fears came true. She developed a few skin polyps on her nose in her last years, despite all the umbrella-toting.

My grandmother's umbrella came screaming back to my consciousness a few years ago when I burned my face off. I went downhill skiing the first (and only) time – and didn't bring sunblock. "No problem," I thought, "I'll just be here a couple hours." Oh, woe is me! How very naïve I was of the dangers of high-altitude UV radiation and snow reflection! I was there for several hours more than I thought I would be, enjoying a newfound excitement of speeding down snowy hillsides, avoiding other skiers, and tumbling onto my butt and through the air in exotic poses of flailing arms, skis, and ski poles. Great fun. I'll do it again some day.

The next morning I noticed a strange moisture seeping out of my cheeks. It was clear liquid. Was I sweating? I didn't know what to think of it, so I taped some gauze on my cheek and went to work anyhow. Once at work, I busied myself as usual. After an hour or so I noticed the gauze was soaked, so I went to the bathroom to re-evaluate my strange condition. When I took the tape off, part of my cheek went with it! And, boy, did the liquid start coming out! I was positively dripping!

Long story short, I rushed to a medical clinic (and sat for hours in their stupid waiting room) and found out I had a second-degree sunburn. The top layer of skin hadn't bothered turning red first, it just said, "Goodbye cruel sunlit world" and died. My pasty face melted off over the course of about a week. "Peeled off" would be too kind of a phrase. I didn't have enough leave time left at work, so I had to work during that time, my face covered with bandages. Not a good situation. Needless to say, I wasn't very popular with coworkers. The thought running through my mind that whole time was, "Am I going to get skin cancer from this rather severe amount of UV radiation?"

Well, researchers at Ohio State University and their colleagues in Germany have now been able to directly observe UV radiation mutating DNA:


It's reported in the latest edition of _Science_: http://www.sciencemag.org/cgi/content/abstract/315/5812/625

It's been known for a long time that exposure to UV light (like in sunlight) can cause skin cancer: the energy from intense UV radiation causes mutations in the DNA, leading to uncontrolled cell growth (cancer) or cell death (in the case of a sunburn or, more extreme, your friggin' face melting off!). Femtosecond transient absorption spectroscopy is a very new method for visualizing extremely fast reactions at the molecular level. In short, a laser excites the molecules, and an extremely fast light pulse and detector track the resulting changes in the molecules' energy levels. Using this method, the researchers directly observed how UV radiation caused the formation of two chemical bonds between adjacent thymine bases in the DNA structure. This was a pretty artificial test sample (artificial thymine-only DNA), not actual cells, and it didn't involve all the chemical pathways involved with sunburn, but it DOES show for the first time actual DNA mutation from UV radiation, which is a huge step in understanding such mutations.

The authors probably don't go so far, but I'm wondering: could similar, non-deadly mutations happen in bacteria or other single-celled organisms, perhaps even changes that could seed new adaptations and, ultimately, evolution?

Anyhow, this all makes me even more concerned about skin cancer and my little bout with my skiing incident!

Thursday, February 1, 2007

African-American Scientists: George Washington Carver

February is Black History Month. To celebrate this, I am going to feature an African American Scientist every Thursday this month.

My first choice is probably the most historically famous: George Washington Carver, a chemist, food scientist, botanist, and agriculturist.

Here is the Wikipedia biography:


Carver was born into slavery in what is now known as Diamond, Missouri, most likely in 1864. His slave owner was a German immigrant named Moses Carver, who traded for George. As George Carver is later quoted, "When I was a child, my owner saw what he considered to be a good business deal and immediately accepted it. He traded me off for a horse." Baby George, his mother Mary, and a sister were later kidnapped from Moses Carver by Confederate raiders. By the time Moses was able to get George back, George had whooping cough, and his mother and sister were most likely dead. Soon slavery was abolished, and Moses Carver and his wife raised George and his brother Jim as their own children and taught them to read and write. Eventually George took his adopted father's last name as his own, by choice. You hear plenty of stories about abusive slave owners, but this makes me think that Moses Carver may have been the exception.

Because of his race, George met with difficulties in attending grade school, but didn't let that stop him from a good education. As Carver is quoted, "Education is the key to unlock the golden door of freedom." He moved from school to school, eventually graduating high school in Minneapolis, Kansas.

Carver also faced the same difficulty getting accepted into colleges, but was eventually successful. He started as the second African-American to attend Simpson College, in Iowa, but eventually moved to what was to become Iowa State College, as their first African-American student. It was there that he started using the middle name Washington, since another student had the same first and last name. After graduation, Carver stayed on as their first African-American Master's student, then faculty member. Carver eventually changed jobs to teach at the five-year-old Tuskegee University, where he remained for 47 years until his death in 1943.

If you've ever heard of George Washington Carver, chances are the things you took away from the lesson were that Carver was an early Black scientist and that he invented all sorts of wonderful and exotic uses for the lowly peanut. Why peanuts? Because all the cotton farming down South had depleted nitrogen from the fields. Peanuts and other legumes replenished the nitrogen, so Carver encouraged farmers to plant them. But there were only so many uses for the plant (salted peanuts, anyone?). So to help all those farmers that had planted a nearly unmarketable crop, he invented lots of recipes and products for them and helped market them.

What was he reputed to have invented? Carver developed between one hundred and three hundred applications for peanuts and 118 for sweet potatoes (http://www.npg.si.edu/edu/brush/guide/unit2/carver.html), including bleach, metal polish, paper, plastic, glue for postage stamps, printer's ink, plant milk, cooking oils, flour, instant coffee, mayonnaise, meat tenderizer, cheese, dyes, shaving cream, shoe polish, synthetic rubber, talcum powder, wood stains, varnish, soap, vinegar, and cooking sauces. He made similar investigations into uses for cowpeas, soybeans, and pecans. He also held three patents (one for cosmetics and two for paints and stains). Now, I'd like to know exactly how a frickin' peanut can be turned into some of these products, but apparently he made it happen, and therein lies his talent. He also invented a form of peanut butter, but don't start thinking Jif or Peter Pan peanut butter. It was more like the oily, gritty, unsugared organic sh*t you get at health stores. My wife buys that stuff to feed to our kids. I refuse to eat it. But apparently his was good enough to launch the invention.

Unfortunately, Carver was not a model scientist in terms of his practice. For one thing, he didn't keep a lab book, and kept all his recipes in his head, refusing to write them down. That means almost none of his inventions can be repeated and are therefore lost to time. Pretty sad. He wouldn't write down lists of inventions, either, which is why there is confusion about exactly how many he came up with. He also claimed God gave him his ideas for plant products. He hated teaching and was very bad at administrative work, preferring to dedicate himself to research (which he was eventually able to do). He had his own 2-room lab, much to the jealousy of other faculty, and lived (get this!) on the second floor of a women's dormitory, accessing his room via a fire escape. He partnered with presidents and captains of industry to develop a number of novel uses, but almost never sought to capitalize off of his endeavors, often giving his advice and expertise freely. No one can say exactly how many of our plant-derived products came from his inspiration.

Carver died at age 76 after a fall down some stairs. He willed his entire savings to Tuskegee University, founding a fund in his name. He has since become an icon of American culture, a symbol of early African-American triumph over slavery and discrimination, and a pioneer of American science.

"Most people search high and wide for the keys to success. If they only knew, the key to their dreams lie within." -- George Washington Carver