Daily Notebook
Copyright 2005 Michael A. Covington. Caching by search engines is explicitly permitted.
If you don't see what you came here for, please scroll down.
2005 February 27-28
Windows' built-in alarm clock
Want your computer to get your attention at a particular time? Here's one way to do it. In Control Panel, go to Scheduled Tasks and create a scheduled task. When asked what application to run, hit Browse and choose a music file. The music will then play at the time you've specified. (The reason Scheduled Tasks asks for your password is that Windows is a multi-user operating system, and the task has to "log on" for itself.)

Dubious academic achievement of the day: an Indian student's fake award from NASA, with the name of NASA's director dramatically misspelled, and other implausible details.
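If you'd rather script it than click through Control Panel, the same idea fits in a few lines of Python. This is just a sketch of mine, not part of Windows; the 7:00 time and the C:\music\alarm.mp3 path are invented examples, and os.startfile works only on Windows.

```python
import datetime

def seconds_until(hour, minute, now=None):
    """Seconds from `now` until the next occurrence of hour:minute."""
    now = now or datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:                    # that time already passed today,
        target += datetime.timedelta(days=1)   # so wait until tomorrow
    return (target - now).total_seconds()

# To use it as an alarm (Windows-only playback; the path is a made-up example):
#   import os, time
#   time.sleep(seconds_until(7, 0))        # wait until 7:00 a.m.
#   os.startfile(r"C:\music\alarm.mp3")    # open the file in the default player
```

Scheduled Tasks is still the better tool, of course, since it survives a reboot.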
2005 February 26
Good old G. K.
Yesterday's highlight was a lecture to the Christian Faculty Forum about G. K. Chesterton by Dale Ahlquist of the American Chesterton Society. If you haven't read G. K. Chesterton, you've missed a treat. Sample some quotations here. Chesterton (1874-1936) was a very influential writer in his own time, but somehow, he got lost in the shuffle after World War II and is no longer widely read. He was a humorist, a popular philosopher, a political and religious thinker, and a writer of detective stories. Some key Chesterton ideas (these are not quotes):
All of this understates Chesterton's wit. He was ceaselessly entertaining and could easily have upstaged anyone on The Tonight Show if it had existed in his time. He was every inch a gentleman. In his case, that was a lot of inches. "I am the politest man in England," he said, "I can stand up on a bus and offer my seat to three ladies."
2005 February 25
A cure for the common cold
With all its marvels, why can't medical science do something about the common cold? Well, it turns out it can - and it's something that could have been done 150 years ago. No, this isn't quackery; it's something that has been recommended to us by at least four doctors, and we've been using it for about a year with great success. Simply wash the germs out of the body with warm salt water. This is called nasal irrigation and can be done with any of several kinds of pumps, including the Grossan Hydro Pulse (pictured) or even an adjustable Water Pik. More information from a medical perspective here.

Note: Our Grossan Hydro Pulse did not hold up well. For repair notes and other information, see December 12, 2006.

I have several times had the spooky experience of washing a cold completely away - that is, after a 5-minute nose wash, I no longer have a cold at all. Apparently it is possible to remove nearly all the bacteria and viruses, so that the body's defenses quickly finish off what little is left.

There are many recipes for the irrigation solution. I use canning salt (which is purer than table salt and dissolves faster), 1 level teaspoon in 500 mL of water. That's 1% salt, by volume. I also add 1/2 level teaspoon of baking soda. Some people use considerably more salt in order to get a decongestant effect.
2005 February 24
A legend in his own time
Today the University held a retirement party for Bob Stearns, who has been with us since, I think, 1968, and who has programmed everything from early mainframes to modern web servers. His specialty is hard problems and large systems - anything that involves pushing a computer to its limits. A few years ago, he suffered a stroke, but it didn't stop him from programming; his last large project for the University was the "Bulldog Bucks" charge account system. At different times I have been honored to have him as a friend, mentor, colleague, and (gasp) graduate student. In the picture, you see him being presented with a silver-plated PC keyboard. I don't know what the larger-than-life Bob-as-Uncle-Sam picture is all about!
2005 February 23
Business is booming, even at Ilford
Did somebody flip a switch on the economy? Business is booming all of a sudden. After a year of near-inactivity, Covington Innovations has 3 (count 'em, 3) typesetting jobs in hand (fortunately all small) as well as a software development job and has turned down a request to repair a piece of scientific equipment. Meanwhile, at the University, I'm catching up with administrivia and plunging into research. And in the photo industry, a piece of very good news: Ilford is out of receivership. Coincidentally, I did darkroom work on Saturday, for the first time in months, and used lots of Ilford paper. Now I'm optimistic about being able to get more of the same kind.
2005 February 22
Soft drink to avoid
Dasani Lemon. I drank some yesterday and got thirstier and thirstier and lost my voice! Lemon-flavored water would be a good thing. This stuff, however, is water, citric acid, and sucralose. It's almost like drinking gooey vinegar.
2005 February 21
Ex-teenager
2005 February 20
Not such a good idea
Yesterday someone pointed out to me an instance of what is unfortunately a common kind of human behavior with computers. To protect privacy, I won't say where this happened. I wasn't directly involved in it, and it's not on a publicly visible web site. There is an online forum for people with heart disease. It's moderated, but members do not normally give their full names or locations, even to the moderator. Two nights ago a member posted that she was having chest pains. Nobody saw her message for about an hour. Even then, nobody knew her name or location, so it was impossible to send an ambulance. Fortunately, she survived. But why did she choose to use an anonymous online forum to seek help with a life-threatening symptom? Not such a good idea. But people do such things with some regularity in all types of online medical forums. Why?
2005 February 19
End of the film era?
The latest issue of Photo Techniques reports that digital cameras beat film on noise and grain at almost all ISO speeds. (What everybody forgot, when condemning digital sensors for their imperfections, is that film isn't perfect either.) Meanwhile, at the PMA show in Florida, Canon has announced a new, improved Digital Rebel. This is in addition to their astronomical 20Da recently announced in Japan. I suspect that in three years, film will have almost disappeared, except as an exotic art material.
2005 February 18
The February flurry
Mid-February is when people and projects that have been slumbering since Thanksgiving suddenly wake up and start moving again. I'm far too busy to do anything but remark on this! Other recurrent flurry dates are April 15, June 15, and September 15. I don't entirely know why.
2005 February 17
Tally-ho!
Today is the last day that fox-hunting is legal in England. After decades of debate, it has been banned, though there is some doubt whether the law will be enforceable. I can't say I disagree with the ban. English fox-hunters don't shoot the fox, or trap it, or anything sensible like that. They run it to death with a pack of hounds. If it doesn't die of exhaustion, it is torn apart by the dogs. More about the debate here.

Software success: JTB Communications is about to begin a marketing push for TIP, which I programmed to their specifications. During a year of testing with about a dozen real customers, it has proved very reliable. In a couple of places I made a mistake and implemented the wrong thing, or implemented something inconsistent, but the program itself has never crashed or lost data. I attribute this partly to using the Microsoft .NET Framework and partly to not trying to be too clever. Both of these decisions are decidedly uncool among the "hacker" crowd (and I mean "hacker" in the good sense). But the world doesn't want cleverness; the world wants reliability.
2005 February 16
Death of the Net, again
The imminent death of Usenet has become a topic of discussion on Slashdot. "Imminent death of the Net predicted" has been a joke since the 1980s. I think it reflects the correct insight that Usenet was not sustainable in the long run. I'm amazed it lasted as long as it did.

Usenet was based on the assumption that communication between sites was much more expensive than communication within sites (typically university campuses, later ISPs). Not any more. Today everyone who wants to participate in a discussion can easily connect to a specific Web server from wherever they are. A forum can be hosted anywhere.

Usenet was also based on the assumption that every site provided some security. You could be sure that everyone posting from yale.edu was known to the Yale computer center, and that action could be taken if someone became too obnoxious. That ceased to be the case when "public access" sites first appeared around 1990, followed by America Online (the notorious aol.com) in, I think, 1993, and then other commercial ISPs. Now any idiot can get on Usenet, and most of them do. Before there was spam e-mail, there was spam on Usenet.

The strength and the weakness of Usenet are one and the same: its utter lack of central control. Most newsgroups aren't moderated; that is, no one controls them. Because messages are shared among a huge number of servers, there's no point at which to exert any kind of control. The Net is censor-proof - but then, so is the World Wide Web. The big difference is that everything on the Web is in some sense identifiable. That doesn't mean there is no anonymity. You can still be anonymous on the Web if some non-anonymous person is willing to cooperate with you and take some responsibility. Anyone can run a forum and let anyone post messages in it - and also stop them!

Right now Google Groups provides public access to Usenet, as well as a huge searchable collection of Usenet postings dating back more than 15 years. Google would like to give the public the impression that they own Usenet, when in fact they are only one node in it. But all the other nodes are starting to die out, and eventually Google may be the only one left.

Why does the human brain have cannabis receptors? Were our bodies designed for marijuana? Of course not. The reason cannabis has a strong effect on the brain is that it happens to match one of the brain's internal chemical systems. The receptors it matches are called cannabinoid receptors; their role is somewhat obscure. There are also opioid receptors, which regulate sensitivity to pain. Again, they weren't designed for opium - they were designed to respond to some of the brain's internal chemicals, which opium happens to resemble.
2005 February 15
Textbook prices
Our university has finally expressed concern about the high cost of college textbooks, and I've had my say in the campus newspaper here. (Backup copy here.)

Anti-cannabis: Here (until Feb. 28 only) is an article in the Journal of Psychopharmacology about an emerging class of drugs that block the receptors in the brain that cannabis (marijuana) stimulates. Besides blocking marijuana intoxication (itself a useful thing to be able to do, since the effects of marijuana linger for hours to days), these drugs also reduce the appetite for food and for addictive drugs such as tobacco. This raises a number of interesting possibilities. Full reference: David Nutt, "Cannabis antagonists: a new era in social psychopharmacology?" Journal of Psychopharmacology, vol. 19, no. 1, pp. 3-4 (2005).

By the way, it is well known, as the article points out, that marijuana makes schizophrenia worse and that marijuana can be addictive. I get tired of dealing with people who mindlessly recite the mantra "marijuana never killed anyone." It's false - marijuana contributed to two suicides of people I knew personally! And getting schizophrenia may be worse than being killed. (Also, a point of logic: if marijuana never killed anyone, how would you know? Do you continuously survey all the deaths in the entire world to be sure that none of them ever had this particular cause?)

Speaking of addictions, tobacco farming in Georgia is dying out fast. There's no longer enough money in it.
2005 February 15
Eventful day yesterday, with enough notes to fill a week. Here goes...
Death of the Net? America Online, the company that brought hobbyists onto Usenet in 1993 and quickly made aol.com synonymous with "clueless newbie," is now shutting down its Usenet server. Fewer than 1 in 20,000 of its members were using it. (More here.) AOL subscribers - and everybody else - can still get to Usenet through Google Groups, which many people think actually runs the network; but no, Google Groups is just an access point and search engine.

Usenet was once a network of UNIX systems. When the Internet came along, the best feature of Usenet was preserved and kept the name. Usenet now consists of newsgroups - forums, divided up by subject - whose messages look and work very much like e-mail. Newsgroups were designed to minimize data transfer costs: on the Web, you connect directly to the server where a page is hosted, but on Usenet, you view a local copy of each message kept on your institution's or ISP's own server.

Usenet is very much a relic of the early Internet. With a total lack of control and accountability, it is remarkably free but also easy to sabotage. In recent years, many hobbyist newsgroups have been rendered unusable by "griefers," people whose hobby is making life unpleasant for others. It's easy to get on Usenet under a false name through an ISP that doesn't care.

There are also legal risks in running a Usenet server. If you have a server, you actively redistribute the material to your clients and to other servers, so you're responsible for it, but you don't really have any control over what comes in. If it's pornographic or infringes copyright, you could be in hot water. AOL has already had to settle with a copyright owner, and the University of Georgia was, some years ago, embroiled in controversy over pornographic newsgroups, which are often truly obnoxious, together with even hotter controversy over "missing" newsgroups, most of which were missing simply by accident.

As if that were not enough, Usenet is where the first spam appeared, even before there was spam e-mail. I think the days of Usenet are numbered. People who want civilized conversation would rather hang out in web-based forums whose members are accountable, such as Yahoo Groups (no relation to Google Groups) or the phpBB forums that people are setting up on web servers everywhere.

Delphi decade: I missed a major anniversary yesterday. As far as I'm concerned, the modern era of computer programming began with Borland Delphi, ten years ago yesterday, although Visual Basic was an important precursor. Unlike earlier Windows programming environments, VB and Delphi are "contract-free" - the programmer doesn't have to do anything for the operating system. In earlier packages, whenever you created a window, you had to write a lot of routines to handle its events. Not in Delphi (nor Visual Basic). You write code only to handle the events you care about; the rest are automatically intercepted and do the default thing, or nothing at all. Thus the user sits in front of the computer and calls subroutines by clicking on buttons, typing things in boxes, and so forth. The program tells the computer how to respond to these actions - nothing else.

Delphi does this more systematically than Visual Basic. Delphi is an object-oriented extension of Pascal, which I still think is an unusually well-designed programming language. Delphi is also the direct ancestor of C#, the language of the Microsoft .NET Framework, thanks to its designer, Anders Hejlsberg, who was hired away by Microsoft. Turbo Pascal was Hejlsberg's first masterpiece; Delphi, his second; and C#, his third.

Super astrocamera: Canon has introduced a special astrophotography version of the EOS 20D, known as the 20Da, but so far they only have a web site for it in Japanese. Special features include increased red response (so you can get pictures of nebulae that show all the emission regions) and live focusing (a mode in which the shutter opens and you see, on the LCD, a real-time magnified view of the image falling on the sensor). I have no idea how much it costs, but I want one! You can view the Japanese pages, badly translated into broken English by Google's computer, here. More on DSLR astrophotography here. A preliminary American account of the EOS 20Da is here; it is reportedly not perfectly accurate at present, but will no doubt be revised.

Mexican baroque: I've just come across some great music from an unexpected source. Yes, there was baroque choral music being composed in New Spain in the early 1700s. Their best composer, Manuel de Zumaya, sounded a lot like Händel. Listen to samples here.
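The "contract-free" event model described under Delphi decade above is easy to mimic in miniature: register handlers only for the events you care about, and let every other event fall through to a harmless default. Here is a toy sketch in Python (not Delphi; the Window class and event names are invented for illustration):

```python
class Window:
    """Toy event dispatcher: unhandled events silently do the default (nothing)."""
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        # Register a handler only for the events you care about.
        self.handlers[event] = handler

    def dispatch(self, event, *args):
        # Events with no handler fall through to a no-op default,
        # which is the "contract-free" idea in Delphi and VB.
        return self.handlers.get(event, lambda *a: None)(*args)

w = Window()
w.on("button_click", lambda name: f"clicked {name}")
print(w.dispatch("button_click", "OK"))   # prints: clicked OK
print(w.dispatch("key_press", "Esc"))     # unhandled: default no-op, prints: None
```

The point is what is absent: no code at all for the events you ignore, which is exactly what made Delphi feel so much lighter than its predecessors.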
2005 February 14
Stupid TV tricks
It's not a very good picture, but here's an example of what I dislike about the local cable TV company. The channel that displays the schedule has slowly been whittled down so that useful information now fills less than 30% of the screen. The rest of the space is commercials - whose closed captioning sometimes overwrites the schedule! They also make the schedule available on the Web, but I have to go through a maze of pop-up ads and enter my location, cable provider, and plan every time. Fortunately I don't watch much TV. It looks like they don't particularly want me to.

St. Valentine's Day: Not a big event around here, because with Melody and me, every day is Valentine's Day. And the history of St. Valentine is somewhat obscure. Anyhow, February 14 is traditionally the day on which birds in merrie olde England were thought to begin their springtime courtship.

We should have a minute of Beethoven in memory of Karl Haas, host of "Adventures in Good Music," which most of us have enjoyed on National Public Radio. He died earlier this month at the age of 91.
2005 February 13
Copernicus and a ghost-writer
I've just done a tiny bit of historical research. As everybody knows, Copernicus' famous De Revolutionibus, the book that in 1543 told us the earth goes around the sun, contains a preface saying that the "hypotheses" need not be literally true; we're free to view them as just a computational technique. Copernicus didn't write this preface. Andreas Osiander did, but it was printed without his name. The question is, could a reader have recognized that the style was not Copernicus's? Michael Maestlin (Kepler's teacher) did, but he was an unusually intelligent and erudite man. That is what led to Kepler's exposure of Osiander's ghostwriting. But in his recent book about De Revolutionibus, Owen Gingerich asks whether other, more ordinary readers could have noticed the difference. I got out my copy of De Revolutionibus and had a look. I think the answer is yes. Copernicus writes very sophisticated Latin, not only in his fancy preface, but even in nuts-and-bolts mathematical exposition. Osiander writes (or rather often lapses into) much simpler Latin, with considerably shorter sentences and fewer subjunctive, gerundive, or participial verb forms. (My copy of De Revolutionibus is a reprint from the 1970s. I don't have an original, and have only once even handled an original - Whewell's copy, at Trinity College, Cambridge.)

At last, somebody thinks about how components are placed on ready-made circuit boards. See One Pas Inc. Entirely too many such boards don't have enough holes connected together in a row to actually build anything. The One Pas people alternate between several different spacing schemes so that many different components will fit.

There's a good crop of News of the Weird today, especially the poll that asked whether President Bush is a uniter or a divider - and the results were split 50-50. Note also the mention of pornography for monkeys.
2005 February 12
Name games
One of the frustrations of using eBay is the fact that everyone uses a made-up account name unrelated to the names they use for other purposes. Now I'm not against businesses being able to make up names for themselves. But when I buy something from AuctionSallyOfTheNorth, and get e-mail from doofus@somewhere.net, and later receive an inquiry whether I mailed my check to Ebenezer Doe of Podunk, Massachusetts, how am I supposed to know that all three are the same? (These, I hasten to add, are made-up examples.)

Speaking of names, here's a point of nomenclature that I picked up while reading about Ash Wednesday the other day. And it's something an educated man like me should have known but didn't: The Roman Catholic Church does not call itself the Roman Catholic Church. It calls itself simply the Catholic Church. The name "Roman" was pinned on it by other churches that claim to be "catholic" (universal) in some way, especially the Church of England.

At least, regardless of the exact name, they know who they are. I belong to a Baptist church which is rewriting its bylaws about transfers of membership to and from other churches. They've run into a real problem defining "Baptist church." First of all, we don't believe that Baptist churches are the only valid churches. Second, Baptists have no hierarchy; the local congregations are self-governing and may or may not belong to larger organizations. Third, not all the churches of the intended type actually call themselves Baptist.
2005 February 11
What I do at the office
A small piece of my externally funded research has now been approved for publication, and you can see it here. I want to thank all the co-authors for their contributions; this was not a solo effort. In fact, this project has been my first experience leading a large scientific team, and I think we're doing unusually well. The published paper is just a review of the literature - not any of our own results. It's going to appear in the journal Schizophrenia Research. We are using computers to analyze the speech of people suffering from schizophrenia and other brain disorders. We've found measurable abnormalities on levels ranging from phonetics to discourse organization. Our measurements will be used to evaluate treatments and, later, also for diagnosis. Results of our own experiments will be published later. (We don't do clinical work ourselves; we work with recordings collected by others.) We also hope to release a much fuller version of the literature review, including introductory material for the nonspecialist.
2005 February 10
Miscellany
If you like good classic detective stories, in the tradition of Sherlock Holmes, don't miss those of Jacques Futrelle (American, despite the French name).

Gadget of the day: The Atmel AVR Butterfly. This is a complete computer about the size of a stack of business cards, with an LCD display, a button for input, an Atmel AVR microcontroller, some I/O devices, and the usual support circuitry. It comes with some software already loaded, and Atmel suggests that real geeks can use it as a name tag at conventions. You can program it in assembly language, C, or other languages. More importantly, because of its very low price, I plan to use it for all sorts of things. (As soon as I can get a lot of prior projects finished, of course!) The $20 package includes the Butterfly and everything you need for assembly language programming. (Get your compilers for other languages elsewhere; some are free.) It's much cheaper to buy the whole module than the individual parts. [Atmel photo.]
2005 February 9
Ash Wednesday
Yesterday was Mardi Gras (Fat Tuesday), and all over Europe, South America, and New Orleans, people were eating, drinking, and making merry. Today is the beginning of Lent, a season of self-discipline observed by Catholic, Orthodox, and Anglican Christians (which doesn't include me, though I have observed Lent to some extent and seldom neglect it totally). Some of them will go to church today and have ashes applied to their foreheads to remind them of their mortality. The observance of Lent in its present form is apparently only a few hundred years old. But it reflects something that evangelical churches are belatedly discovering: people need a cycle of periodic rededication. Lots of historical details (but not up-to-date information) are given here and here.

Check your Wisconsin quarters - the Mint inadvertently made three versions of the design, and collectors are paying big bucks for the rarer varieties. (The site I've linked to is the only one I could find with sufficiently good pictures.) I also suggest using a search engine to look for "Wisconsin quarter corn."

And get your Windows updates today; they're important.

A bit of wholesome entertainment: If you like Sherlock Holmes stories, especially apocryphal ones, look at these, especially "The Defective Detective." Some humor for electrical engineers (and the rest of us) is at RF Cafe.
2005 February 8
The significance of R. A. Fisher
When, two days ago, I wrote about the importance of statistical significance tests, I failed to name the man who did the most to develop and promote them: R. A. Fisher (Sir Ronald Fisher), many of whose works are now available online. I was delighted to discover that Fisher was a practicing Christian, who attended chapel in his Cambridge college (Caius) and even sometimes preached there, though he was skeptical about "dogma." (See his Royal Society biographical sketch.)

Fisher is better remembered for taking exactly the wrong side in the controversy about whether smoking causes lung cancer. He thought it didn't. His statements may have been defensible in 1958, but they were immediately bowled over by subsequent scientific investigation.

On the philosophical side, Fisher defended the notion that "probability" is a way of measuring ignorance - if we say the probability of something is 25%, we mean that it is true in one of four different situations, and we can't tell which of the four we're actually in. This is a bit different from the "frequentist" position that "25% probability" means "if you try it over and over, it will happen 25% of the time" (which is meaningless if you're only going to try it once), or the "subjective probability" notion that "25% probability" means only a quarter of a confident belief, whatever that might be. See this essay for Fisher's explanation.

His other mathematical writings are sometimes way over my head, but sometimes entertaining. There's an essay on why the Sieve of Eratosthenes is called a sieve, as well as an amusing wild ride through the notion of the probability of a probability. (Suppose you don't know the probability of X, but you do know there's at least a 25% chance that the probability of X is at least 50%...)

Fisher's most influential statistics handbooks - almost the only handbooks for a generation of researchers - have been reprinted in one handy volume by Oxford University Press. For some reason Amazon does not have the book in stock, but used copies of the paperback are selling for $200. Me? I got it out of the library.
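The "probability of a probability" idea can be made concrete with a small simulation. Assume, as an invented example of mine (not Fisher's), a 25% chance that we are in a situation where P(X) = 0.5, and a 75% chance that we are in one where P(X) = 0.1. Averaging over our ignorance gives P(X) = 0.25 × 0.5 + 0.75 × 0.1 = 0.2, and running the experiment many times agrees:

```python
import random

random.seed(42)

def trial():
    # First, chance decides which situation we are in...
    p_x = 0.5 if random.random() < 0.25 else 0.1
    # ...then chance decides whether X happens, at that situation's rate.
    return random.random() < p_x

n = 100_000
freq = sum(trial() for _ in range(n)) / n
print(round(freq, 2))   # close to 0.25*0.5 + 0.75*0.1 = 0.2
```

So the uncertainty about the probability simply folds into an overall probability, which is the unglamorous resolution of the wild ride.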
2005 February 7
Freud or fraud?
I had known for a long time that Sigmund Freud's psychoanalytic theories were not considered to be very good science. They are in fact hardly used in modern psychiatry. The problem is that Freud imposed interpretations on the cases, and there wasn't much of a way to test whether the interpretations were correct. You just had to "believe," as Peter Pan put it. That, and psychoanalysis did not produce a steady stream of cures the way Freud claimed it would. But I didn't know until just now that Freud had in fact been charged with fabricating case reports. The frauds of Freud are reported in The Great Betrayal: Fraud in Science, by H. F. Judson, which I've just read. (Or rather skimmed; it doesn't really contain enough material to fill a book.) They are based on material that wasn't made public until long after Freud's death. In particular, "Dora," the classic case of hysteria, was not at all as Freud reported. In that case Freud simply ignored undisputed facts in order to make the case fit his theories. And Sergei Pankeev, Freud's famous "wolf man," reported sadly in the 1970s that Freud hadn't done him any good at all - in fact Pankeev spent his life disabled, in and out of institutions, while Freud bragged about the "cure." (I hasten to add that Pankeev dreamed about wolves - he didn't think he was one.)
2005 February 6
The significance of significance
If asked to name the main scientific advances of the twentieth century, you'd probably mention DNA, the structure of atoms, and maybe the invention of computing machines. But there's another scientific advance that has made all our lives better - sometimes to an enormous extent - and it's something people aren't even told about in high school. I refer to the development of statistical significance tests to extract knowledge from random variation.

If you're foggy on what a significance test is, let me run through an example. Suppose you want to know whether a new kind of fertilizer makes corn plants grow taller. So you grow some corn, under identical conditions, with and without the new fertilizer, and you measure a dozen plants from each bunch. And suppose the fertilizer-treated plants are taller. The question is, how much taller do they need to be to convince you that the increase is due to the fertilizer, rather than just the fact that they're not the same plants? After all, two different groups of corn plants would never have been exactly the same height anyhow.

To find out, you can perform a significance test. This is a calculation that takes into account the variability of the heights in both sets, as well as the size of the sample. It tells you the probability that a difference as large as the one observed could have arisen from sampling variation alone, with no help from the fertilizer. Traditionally, if this probability is less than 1/20 (or as statisticians say, P<0.05), the difference is considered significant, i.e., sufficiently unlikely to be a mere accident of sampling.

Significance tests entered the world with Karl Pearson's chi-square test in 1892, but they didn't achieve wide acceptance until after the turn of the century. The test I'd actually use, in the example just given, is called a t-test, first published by William S. Gosset in 1908. Gosset called himself "Student" because his employer wouldn't let him publish under his own name.
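The corn example can be worked through with nothing but the Python standard library. The heights below are invented for illustration; with 12 plants per group there are 22 degrees of freedom, for which the two-sided 5% critical value of t is about 2.07 (from any t table):

```python
from statistics import mean, variance

def t_statistic(a, b):
    """Two-sample t statistic with pooled variance (the classic Student/Gosset form)."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (pooled * (1 / na + 1 / nb)) ** 0.5

# Invented heights in centimeters, one dozen plants per group:
fertilized = [183, 179, 190, 185, 188, 181, 186, 184, 189, 180, 187, 182]
untreated  = [178, 174, 183, 180, 176, 181, 175, 179, 182, 177, 173, 184]

t = t_statistic(fertilized, untreated)
print(round(t, 2))        # prints 4.08
print(abs(t) > 2.07)      # True: with 22 degrees of freedom, P < 0.05
```

Here the 6 cm difference in means is large compared with the spread within each group, so the t statistic comfortably exceeds the critical value and we call the result significant.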
As you might guess from my example, the first big application of significance tests was in agriculture. Even if you have no idea why something works, you can measure whether it works, and improve your crop production, step by step, a few percent at a time. Indeed, the statistical study of agriculture at the University of North Carolina was a big part of the post-WWII rise of southern academia. At the University of Georgia, we also have a big statistics department, and our computer center was founded by a cattle geneticist.

More importantly, significance tests have moved medical research from testimonials to experiments. Notoriously, the human body is self-repairing, and some patients will get better no matter what is done to them. Because of this, no matter how ineffective a treatment may be, you can always find people who will give testimonials about how much it helped them. That is what makes quackery possible. Significance testing makes it possible to tell whether people are getting well because of the treatment or in spite of it. It's also why it is nowadays a good idea to ignore testimonials unless you have enough examples - both good and bad - to indicate the range of variation. Good physicians knew this intuitively all along; modern statistical techniques make it possible to compute significance, not just guess at it.

The rise of modern statistics has had some disadvantages, too. First, it has spawned a considerable amount of stupid research. Remember that even when a treatment does nothing at all, there is still one chance in twenty of getting P<0.05 - a "significant" result that really came from sample variation rather than anything real. Some researchers run dozens and dozens of ill-conceived experiments, and about 1/20 of them come out "significant." Other researchers then have to come along and replicate the experiments to find that the results aren't reproducible.
For this reason, I keep drumming into my students that the purpose of statistics is not to find P < 0.05; it's to find out how things work. A low P value, out of the blue, is probably bogus.

Second, modern society tends to pay too much attention to minor hazards and not enough to big ones, because even small hazards are "significant" when you survey an enormous sample, and people think "significant" means "important" when it really only means "probably not due to sampling variation." The main hazards to human life, for most Americans, are automobiles, tobacco, alcohol, and unhealthy eating (obesity and cholesterol). These are what we should focus on if we want people to live longer. But people ignore these big hazards and focus on removing paltry amounts of chemical or radioactive contamination - even when the danger is demonstrably very small - just out of fear of the unknown, and because statisticians say the danger is "significant" (i.e., detectable in a large sample). It's like witch hunts.

A mathematician - I don't remember who - suggested that the way to deal with low-level radioactive contamination in a town would be to improve the traffic signals and make sure all the lanes are well marked. That would make the town safer to live in, on the whole, than it was before the contamination occurred. Not surprisingly, people didn't like the idea.

One other note. The harm done by tobacco went unrecognized in the 1950s because so many other health hazards had suddenly been taken away. "We're the healthiest generation that ever lived," people said, and they were right. Their life expectancy of 70 years was something to be proud of. Who knew that smoking was almost the only thing holding them back from 80 or 85?

On the history of statistics, see Counting for Something, by W. S. Peters, which I've enjoyed reading this week. On the statistics of danger and hazards in everyday life, see the works of John Allen Paulos.

About that hat: Yes, I am a real Cambridge alumnus (M.Phil. degree, Clare College, 1978) and am still in touch with people there.
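The gap between "significant" and "important" in the entry above is just arithmetic. The sketch below is my own illustration with made-up numbers: a hazard that shifts some health measure by one hundredth of a standard deviation - negligible by any practical standard - is invisible in a small survey but comes out decisively "significant" in a huge one.

```python
import math

def z_score(effect_sd, n_per_group):
    """z statistic for a true difference of means of effect_sd standard
    deviations, comparing two groups of n_per_group people each."""
    return effect_sd / math.sqrt(2.0 / n_per_group)

tiny_effect = 0.01  # one hundredth of a standard deviation

for n in (100, 10_000, 1_000_000):
    z = z_score(tiny_effect, n)
    verdict = "significant" if abs(z) > 1.96 else "not significant"
    print(f"n = {n:>9,} per group: z = {z:5.2f} -> {verdict}")
```

The effect size is identical in every row; only the sample size changes. That is the sense in which "significant" means "detectable in a large sample," not "important."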
|
|||
2005 February 4 |
"Dude"?
I often wear this hat →
I was wearing it back in November when our building was evacuated and I found myself in a large, motley crowd of people walking rapidly along the tree-lined avenue that leads northward out of it. Two of these people were girls who seemed about 50 years younger than me - though in reality, I hasten to add, people 50 years younger than me have not been born yet. One of them saw the hat and, suitably impressed, turned to her companion and said: "Woo! Cambridge University dude!"

Since then, that's what Cathy has been calling me. Or, in e-mail, Cambr1dg3 d00d.

By the way, I have a tradition known as the Hat Ceremony. Whenever one of my graduate students wins a scholarship to Cambridge, I take the Cambridge hat off my own head and place it on theirs. So far, I'm on the third hat.
|
|||
2005 February 3 |
Garner's oscillator, one last time
I did a few more experiments with Garner's oscillator and found out several things.

Garner's original circuit does work if you use germanium transistors similar to his 2N107, CK722, and the like. I had good luck with a Texas Instruments house-numbered transistor with a 1975 date code and hfe of about 100. With higher-gain Japanese transistors, which were the only other germanium transistors I had on hand (other than rare antiques!), the circuit latched with both transistors conducting. Any attempt to put a current-limiting resistor between the two transistors, or in the second transistor's emitter circuit, stopped the oscillation. This is a pity, because the circuit isn't "safe" - it can damage the transistors if it stops oscillating with both transistors "on."

The red herring was in fact the 4.7-kilohm load resistor. It is much too big. This resistor determines the output impedance of the second stage, which limits the amount of drive available for feedback. I changed it to 100 ohms and was then able to get away with inserting a 100-ohm current-limiting resistor between the transistors. With that resistor added, the circuit is "safe." In fact, an 8-ohm speaker can go in place of the 4.7k load. That's probably what I actually did in sixth grade, and the low-impedance load may even be what made the circuit, as Garner published it for germanium transistors, oscillate with my silicon ones.

What put me on the trail of these improvements was an article, "Complementary Multivibrator," by J. C. Rudge, in Wireless World, August 1962, pp. 360-361. Below is his circuit (which used silicon transistors), with early-1960s British transistor symbols, and then redrawn in modern notation:
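The effect of the load resistor on available feedback drive is simple Ohm's-law arithmetic. The figures below are my own rough illustration, assuming a 6 V battery supply (the exact voltage doesn't change the ratios between the three cases):

```python
supply = 6.0  # volts; assumed battery voltage, for illustration only

# The most current the second stage can push through the feedback path is
# roughly the supply voltage divided by its load (output) resistance.
for load_ohms in (4700, 100, 8):
    drive_ma = supply / load_ohms * 1000
    print(f"{load_ohms:>5}-ohm load: at most about {drive_ma:.1f} mA of drive")
```

A 4.7 k load can supply only about a milliampere of feedback drive; dropping to 100 ohms raises that ceiling nearly fifty-fold, which is consistent with the circuit then tolerating a 100-ohm current-limiting resistor, and an 8-ohm speaker raises it further still.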
But that's not the last word. All these experiments have made it clear that the biasing of the first transistor is what's critical. Here's a version that requires more power - its bias network conducts continuously - but is even more reliable, since the first transistor is held in its linear region, and RE limits current:
I found it in John Markus, Guidebook of Electronic Circuits (New York: McGraw-Hill, 1974), p. 548, and have not seen the original article. Finally, a distant relative of the same oscillator, with even more complicated but reliable biasing, was used in the original Bell Labs artificial larynx, whose circuit has been published in a number of places, including Garner's Transistor Circuits.
|
|||
2005 February 2 |
Miscellany
I'll resume boring you with Garner's Oscillator tomorrow. In the meantime:

Here is a collection of foolish legal reasoning that people use to avoid paying income tax. Some of it is pure fiction ("Ohio never ratified the Sixteenth Amendment" - false), and the rest of it involves twisted definitions of words.

Looking for an investment? Here is one that comes highly recommended (Fidelity Four-in-One Index Fund; performance chart here, comparing it to the Dow Jones Average). Through mutual funds you can invest in nearly anything (whether or not it is wise to do so - cf. the municipal bond market). Unless you have good reasons to do otherwise, choose a fund with low expenses, and one you can deal with directly, without going through a broker.
|
|||
2005 February 1 |
What is good teaching?
Yesterday I was on the editorial page of the campus newspaper with this. (Backup copy here.) Enjoy.

Garner's mysterious oscillator

I promised I'd say a little more about the oscillator circuit from Louis E. Garner's Transistor Circuits (2nd ed., Coyne Electrical School, 1960) that I mentioned on January 27. This is a 2-transistor complementary relaxation oscillator which Garner classifies as a type of astable multivibrator. Here's his diagram of it:
Notice that the NPN transistor is drawn with a PNP transistor symbol. The transistors are meant to be common germanium types of the late 1950s.

As I mentioned, I built this circuit in sixth grade (1968) and it made squealing noises in a speaker, which, as best I recollect, was in series with the battery (probably 9 volts rather than 6). I used silicon transistors, types 2N3638 and 2N3646. Well...

The circuit has been hard to analyze or simulate. Here it is, redrawn in modern notation:
The first thing we notice is that the circuit looks dangerous. If the first transistor conducts steadily, it will send excessive current into the base of the second one, which will burn out. But then we notice that, in fact, the first transistor is biased not to conduct. It looks as if the circuit will come to rest in a stable state with neither transistor conducting (just as if the capacitor were not there).

And that's exactly what it does. This evening, I breadboarded it and confirmed that the actual circuit - not just the computer simulation - does not oscillate. I used newer transistors, types 2N4401 and 2N4403 (hfe around 200), as well as older, low-gain types 2N3638 and 2N3646, all silicon.

Then, on a hunch, I tried making one change, and then the circuit worked as Garner described:
The timing resistor needs to bias the first transistor into conduction, not into cutoff. The oscillator will then run on any voltage from about 1.2 V up, and the frequency is independent of voltage until you get to about 7 V (where, presumably, reverse base-emitter conduction starts to help discharge the capacitor).

So... Did Garner's circuit work, as published, or did he make a mistake? It's possible his circuit worked with germanium transistors, which bias themselves slightly into conduction unless prevented. The second transistor might have enough leakage to ensure regeneration with Garner's original circuit. I'm going to make some tests.

But what about the version I built in sixth grade, with silicon transistors? It worked. Maybe, by mistake, I altered the circuit just as shown in my second diagram. There's some room for confusion; Garner uses the ground symbol for the positive supply, which is slightly unconventional, and it may have led me to make one connection incorrectly. Or maybe he drew the circuit incorrectly, and the circuit he actually built was the same as my second diagram. We know there's one mistake in Garner's diagram; maybe there are two.

[For the answer, see February 3.]
|