
Yes, We may Have Made a Mistake. June 3, 2011

Posted by gordonwatts in ATLAS, computers.

No, no. I’m not talking about this. A few months ago I wondered if, short of generating our own reality, ATLAS made a mistake. The discussion was over source control systems:

Subversion, Mercurial, and Git are all source code version control systems. When an experiment says we have 10 million lines of code – all that code is kept in one of these systems. The systems are fantastic – they can track exactly who made what modifications to any file under their control. It is how we keep anarchy from breaking out as >1000 people develop the source code that makes ATLAS (or any other large experiment) go.

Yes, another geeky post. Skip over it if you can’t stand this stuff.

ATLAS switched some time ago from a system called CVS to SVN. The two systems are very much alike: centralized, top-down control. Old school. However, the internet happened. And, more to the point, the Cathedral and the Bazaar happened. New source control systems have sprung up. In particular, Mercurial and Git. These systems are distributed. Rather than asking for permission to make modifications to the software, you just point your source control client at the main source and hit copy. Then you can start making modifications to your heart’s content. When you are done you let the owner of the repository know and tell them where your repository is – and they then copy your changes back! The key here is that you had your own copy of the repository – so you could make multiple modifications w/out asking the owner. Heck, you could even send your modifications to your friends for testing before asking the owner to copy them back.

That is why it is called distributed source control. Heck, you can even make modifications to the source at 30,000 feet (when no wifi is available).

When I wrote that first blog post I’d never tried anything but the old-school source control systems. I’ve now spent the last 5 months using Mercurial – one of the new-style systems. And I’m sold. Frankly, I have no idea how you’d convert the 10 million+ lines of code in ATLAS to something like this, but if there is a sensible way to convert to Git or Mercurial then I’m completely in favor. Just about everything is easier with these tools… I’ve never done branch development in SVN, for example. But in Mercurial I use it all the time… because it just works. And I’m constantly flipping my development directory from one branch to another because it takes seconds – not minutes. And despite all of this I’ve only once had to deal with merge conflicts. If you look at SVN the wrong way it will give you merge conflicts.
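
For the curious, here is a minimal sketch of that distributed workflow, driving Mercurial from Python. It assumes hg is installed; the repository URLs, branch name, and file are made up for illustration – this is the generic clone/branch/push dance described above, not anything ATLAS-specific.

    import pathlib
    import subprocess

    def hg(*args, cwd="my-copy"):
        """Run a Mercurial command in the working copy; stop if it fails."""
        subprocess.run(["hg", *args], cwd=cwd, check=True)

    # Grab your own complete copy of the repository -- no permission needed.
    subprocess.run(
        ["hg", "clone", "https://example.org/some-analysis-repo", "my-copy"],  # hypothetical URL
        check=True,
    )

    # Work on a named branch; switching is fast because everything is local.
    hg("branch", "my-feature")
    pathlib.Path("my-copy/notes.txt").write_text("work in progress\n")  # edit something
    hg("add", "notes.txt")
    hg("commit", "-m", "First cut at the new feature")

    hg("update", "default")     # flip back to the main line...
    hg("update", "my-feature")  # ...and back again, in seconds

    # When you are happy, publish your changes somewhere the repository owner
    # (or your friends) can pull from and merge.
    hg("push", "--new-branch", "https://example.org/my-public-clone")  # hypothetical URL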

All this said, I have no idea how git or Mercurial would scale. Clearly it isn’t reasonable to copy the repository for 10+ million lines of code onto your portable to develop one small package. But if we could figure that out, and if it integrated well into the ATLAS production builds, well, that would be fantastic.

If you are starting a small stand-alone project and you can choose your source control system, I’d definitely recommend trying one of these two modern tools.

The Ethics and Public Relations Implications of asking for help April 25, 2011

Posted by gordonwatts in Large Collaborations, physics life.

I’ve been having a debate with a few friends of mine. I have definite opinions. First, I’ll lay out the questions. They span ethics and also potential PR backlash. These conversations, btw, are all with friends – no one important, so don’t read anything into this! This is long, and my answers are even longer, but I hope a few of you will read and post (yes, everyone is busy)!

Let’s take a purely hypothetical situation. A person has joined a large scientific collaboration like CDF, DZERO, ATLAS, or CMS. As part of joining they agree to abide by a set of rules. For example, not discussing an analysis publicly before it has been approved by the experiment.

I apologize in advance to those who are not part of this life, or who don’t care. This blog posting will be even less interesting than normal!

Here are the questions. I’m curious about the answers from both an ethics point of view and a political point of view. Or any other point of view you care to bring to bear. I’ve put my answers below. The setup below is hypothetical! And I have some personal issues with #7! #8 is the one I’ve gotten most push back on when talking with people.

  1. You are a member of said collaboration and you anonymously post all or part of an internal document to a blog.
  2. You are a member of said collaboration and you post all or part of an internal document non-anonymously to a blog.
  3. The blog owner(s) are unaffiliated with any experiment. Are they obligated to take it down?
  4. The blog owner is affiliated with the experiment (e.g. say someone posted an internal DZERO or ATLAS abstract to my blog). Are they obligated to take it down?
  5. Is it ok for the experiment to ask the blogger to reveal the poster’s information? For example, the wordpress blogging platform, which I use, keeps internally a record, visible to me, of the poster’s IP address, which might be able to identify the poster. Is the answer any different if the blog owner is a member of the same experiment? How about a member of a competing/different experiment?
  6. Does the blog owner have to respond with the information to the experiment?
  7. What if the blog owner is a member of the same experiment? Do they have to respond then?
  8. Does the experiment have to ask the blog owner for help?

Ok. So, here are my answers. These aren’t completely thought out, so feel free to call me out if I’m not being consistent. And these are my opinions below, no matter how strongly I state them.

  1. This is clearly unethical. You are violating something that you agreed to in the first place, voluntarily. Further, by doing this anonymously you are basically trying to get away without being accountable – so you are taking no responsibility for your actions – which is also unethical. The PR result depends, obviously, on what is posted. If the topic is interesting enough to the mainstream, articles will end up on the mainstream news sites. If this damages the credibility of an actual result when it is released then real harm has been done. It is not likely that it will damage the credibility within the field, however.
  2. For me this is more murky. You clearly have violated the agreement that you signed initially. But you have also made it clear who you were when you posted it – so you are taking responsibility and accepting the consequences for your actions. In the first half you are not behaving ethically, but in the second half you are. It seems the PR consequences are similar, except they will be much more personal because the press will be able to get in touch with you. A large faceless experiment, like DZERO or ATLAS, will have a much harder time countering this (people make better stories!).
  3. Ethically, I don’t think you are obligated to take it down if you are not affiliated with any experiment. That was someone else’s agreement, and not one that you signed up for. I follow the thinking of various places that deal with whistleblowers. Now, the blog owner may have their own set of ethical guidelines for the blog, for example, “I will not traffic in rumors,” and then ethically they should not make an exception for a particular post. But that is strictly up to them – they could just as easily say that “this blog traffics in rumors!” The PR aspect of this really depends: if the blog is up front about what it is, then the PR won’t reflect on it as much as it will reflect on the rumor. If the blog does something that violates its own guidelines – like normally it ignores rumors except in this particular one because it is a big one – then part of the PR will be focused back on them. This is a wash, in my opinion.
  4. If the blog was owned by a member of the same experiment then I do think they would be obligated to take it down. The blog owner, upon joining the experiment, agreed not to reveal secrets, and the blog is an extension of the person who made the agreement. From a PR perspective, this would put the blog owner in a fairly difficult position! First, most of us small-time bloggers allow comments w/out waiting for approval, so it could be up for several hours before it gets taken down. Any of the RSS comment aggregators would easily have time to grab it before it disappeared. So, it would be out there for anyone with a bit of skill even if it had already been taken down. So the PR would, basically, be the same as the other case. But, if any press came to call, the blog owner would have to say “No Comment.” Ha!
  5. So, it is fine for the experiment to ask the blog owner for any identifiable information about the poster. They are not violating any of their ethics. The PR response, however, can vary dramatically. After the experiment asks, the blogger could respond “Yes” or “No”. And then everyone moves on. But the blogger could also post a copy of the request and say something like “This 3000 person scientific organization is putting pressure on me to reveal my sources. This is a clear suppression of free speech, etc. etc.” What happens next is anybody’s guess and really depends on the blogger’s reputation, their popularity, who picks it up and runs with it, etc. So, anything from forgotten to a PR nightmare for the experiment. For a blogger that wants to prove that they will keep their rumor sources confidential – and thus get more rumors – this could be a big plus. Add to this the likelihood that there is no identifiable information, and I conclude it isn’t worth it. Now, if the blogger is a member of the experiment, or the blogger is well known to individuals on the experiment, a small conversation can happen over the phone or in person to see if the blogger might be willing to help out.
  6. First, if the blogger is not a member of the experiment. In this case, I do not think there is any ethical reason for the blogger to respond. By the same token, I do not think the experiment can get bent-out-of-shape if the blogger declines to help. I don’t think there is any real PR aspect to this question (other than what was above). Something to keep in mind: depending on the severity of the leak, you may be ending or seriously affecting someone’s career (judge/jury/etc.) by giving up that technical information – which could be spoofed.
  7. Now, if the blogger was on the same experiment, then things get more tricky. Ethically, you agreed to keep your experiment’s secrets, but you didn’t agree to tattle on a fellow collaboration member. I feel like I’m on thin ice here, so any comments yes or no to this would be helpful – especially because I could see myself in this position! While that may be the case, the experiment could bring a huge amount of peer pressure to bear on the blog author if they are a member. This effect should not be underestimated.
  8. This may seem like an odd question. Think of it from this point of view. An internal document has just been leaked. You are one of 3000 people working hard on this experiment. Something that you’ve had no input into, and perhaps seriously disagree with, has been put out on the web. You are still bound by the agreement with the collaboration so you can’t counter why you think it is bad. You have to stand by, frustrated, as this document is discussed by everyone except the people it should be discussed by. Worse, what if this person who did the posting gets away with it!? There are no consequences to what they did? Worse, what if the collaboration changes the way it does internal reviews and physics in order to keep things more secret from even its own members to lessen the chances of another leak? Now the person doing the leak has seriously impacted your ability to work and nothing has happened to them. So, should the collaboration do all it can to track this leaker down? Whew. Yes. But what if tracking this person down causes more damage (like the free speech PR nightmare I mentioned above)? I have a lot of trouble answering this question. In isolation the answer to this is clearly yes. However, when the various possible outcomes are considered, it feels to me like it isn’t worth it.

One final thing. As far as I can see, it seems to me that no actual laws have been broken by any of the proposed actions. That is, you couldn’t sue in a court of law for any of the actions. There is no publicly recognized contract, for example. Do people agree with that? Any key questions I missed that should be in the above list?

Scientific Integrity April 22, 2011

Posted by gordonwatts in physics, physics life, politics, press, science.

… means not telling only half the result

… means not mis-crediting a result

… means an obligation to society to not falsify results

… means not making false claims to gain exposure

… means respecting your fellow scientist and their results

… means not talking about things that aren’t public (or, say, that haven’t undergone an internal review)

… means playing by the rules you agreed to when you enter into a collaboration

It means being a scientist!

Integrity is more important than ever given how much the public eye is focused on us in particle physics.

Update: I should mention that this post was authored with Alison Lister.

Global Entry–Just Get It April 20, 2011

Posted by gordonwatts in travel.

A month or two ago I was traveling back from Geneva with a friend of mine. Kaori and I were on a flight that was late – about an hour late. We landed at IAD and really had to race to make our connections (we had less than an hour). We raced to immigration and I got in line. Looking around – I couldn’t find her… looking over to the side, I saw her at some kiosk… in about a minute or so she was racing through to the baggage pick up. Me… I hung out in the line for about 5 minutes.

She was using the Global Entry program. Having signed up and used it for my most recent flight… I’m a fan. It is fairly cheap – $100 for 5 years. You do have to give up fingerprints and a picture to the US government – as far as I know that is the first set of fingerprints any government agency has on record for me – so that was a little weird. As an example, on my last flight into IAD the plane doors were opened at 4:10 pm. At 4:22 pm I was in the X-Ray line. This included more than 5 minutes of walking since our plane was waaaay down the terminal. You use a kiosk instead of a person in the immigration area. I’d say it took the same amount of time as dealing with an officer who decided not to ask any questions and if there were no lines – about 90 seconds or so. Extra bonus: no filling out those @*#&@ blue customs forms (there is an abbreviated version on the kiosk). And, when you go through customs, there is a separate line that lets you cut to the front (at least, in IAD). You just hand them a bit of paper that the immigration kiosk printed out and you are done.

I could imagine there are a number of circumstances that don’t make this worth it. If you always travel with kids under 14 you can’t use this (well, the kids can’t use this), if you always check baggage the time saved will be a small fraction of your total time, and I think there are only about 20 airports that support it (these are your international ports of entry). Oh, and if you like watching people while standing in lines to relax after that long flight being cooped up… then this isn’t for you either.

My flight into IAD earlier this week was over an hour late. I had less than an hour to connect. A student of mine and I were both on the plane and both were on the connecting flight to Seattle. Neither of us had bags checked. The Seattle flight was in D29 in IAD (which means a long walk). I did a brisk walk and made it before boarding started. He had to sprint some of the way and made it after everyone had already boarded – but he still made it. BTW – I was also able to skip to the front of the X-Ray line which can be killer in IAD because I’d been upgraded on that last leg. That probably saved me an additional 10 minutes or so on this trip.

So… I’d recommend getting this if you fly internationally with any frequency. It definitely made that part of my trip quicker and, thus, more enjoyable!

As a side note… WHY don’t they design the airport so that if you don’t have to pick up your luggage you don’t have to go through security again?

Cherry Blossoms April 7, 2011

Posted by gordonwatts in life, University of Washington.


It happens once a year, of course: Cherry Blossom Season. You can find it all over – Japan is famous for it. But back at the University of Washington we have our own little grove of Yoshino Cherry trees on the Quad. For the two weeks or so that they are in bloom the place becomes a bit of a tourist destination – it is packed with people. Some just sitting and reading, but most walking around and snapping pictures. I went a little crazy this year. If you love this stuff, you can find it all over the web. Here are links to some of the stuff I’ve taken:

  • Pictures from a cloudy day on flickr.
  • A large panorama view. This is probably the easiest one to get an understanding of what the square looks like.
  • A giant 451 photo 3D reconstruction (a photosynth). I’m really looking forward to the technology (recently previewed) where you can walk around with a video camera and that is enough to build one of these!
  • A desktop theme pack for Windows 7. If you like having your background image change every 30 minutes to a different view of cherry trees, well, this is for you!

Enough till next year!

Jumping the Gun April 4, 2011

Posted by gordonwatts in Uncategorized.

The internet has come to physics. Well, I guess CERN invented the web, but, when it comes to science, our field usually moves at a reasonable pace – not too fast, but not (I hope) too slow. That is changing, however, and I fear some of the reactions in the field.

The first I heard about this phenomenon was some results presented by the PAMELA experiment. The results were very interesting – perhaps indicating dark matter. The scientists showed a plot at a conference to show where they were, but explicitly didn’t put the plot into any public web page or paper, to indicate that they weren’t done analyzing the results or understanding their systematic errors. A few days later a paper showed up on arXiv (which I cannot locate) using a picture taken during the conference while the plot was being shown. Of course, the obvious thing to do here is: not talk about results before they are ready. I and most other people in the field looked at that and thought that these guys were getting a crash course in how to release results. The rule is: you don’t show anything until you are ready. You keep it hidden. You don’t talk about it. You don’t even acknowledge the existence of an analysis unless you are actually releasing results you are ready for the world to get its hands on and play with as it may.

I’m sure something like that has happened since, but I’ve not really noticed it. But a paper out on the archives on April 1 (yes) seems to have done it again. This is a paper on a set of Z’ models that might explain a number of the small discrepancies at the Tevatron. A number of the results they reference are released and endorsed by the collaborations. But there is one source that isn’t – it is a thesis: Measurement of WW+WZ Production Cross Section and Study of the Dijet Mass Spectrum in the l-nu + Jets Final State at CDF (really big download). So here are a group of theorists, basically, announcing a CDF result to the world. That makes me a bit uncomfortable. What is worse, however, is how they reference it:

In particular, the CDF collaboration has very recently reported the observation of a 3.3σ excess in their distribution of events with a leptonically decaying W± and a pair of jets [12].

I’ve not seen any paper released by the CDF collaboration yet – so the above statement is definitely not true. I’ve heard rumors that the result will soon be released, but they are rumors. And I have no idea what the actual plot will look like once it has gone through the full CDF review process. And neither do the theorists.

Large experiments like CDF, D0, ATLAS, CMS, etc. all have strict rules on what you are allowed to show. If I’m working on a new result and it hasn’t been approved, I am not allowed to even show my work to others in my department except under a very constrained set of circumstances*. The point is to prevent this sort of paper from happening. But a thesis, which was the source here, is a different matter. All universities that I know of demand that a thesis be public (as they should). And frequently a thesis will show work that is in progress from the experiment’s point of view – so they are a great way to look and see what is going on inside the experiment. However, now with search engines one can do exactly the above with relative ease.

There is plenty of potential for over-reaction here.

On the experiment’s side they may want to put restrictions on what can be written in a thesis. This would be punishing the student for someone else’s actions, which we can’t allow.

On the other hand, there has to be a code-of-standards that is followed by people writing papers based on experimental results. If you can’t find the plot on the experiment’s public results pages then you can’t claim that the collaboration backs it. People scouring the theses for results (as you can bet there will be more now) should get a better understanding of the quality level of those results: sometimes they are exactly the plots that will show up in a paper, other times they are an early version of the result.

Personally, I’d be quite happy if results found in theses would stimulate conversation and models – and those could be published or submitted to the archive – but then one would hold off making experimental comparisons until the results were made public by the collaboration.

The internet is here – and this information is now available much more quickly than before. There is much less hiding-thru-obscurity than there has been in the past, so we all have to adjust.

* Exceptions are made for things like job interviews, students presenting at national conventions, etc.

Update: CDF has released the paper

Digitize the world of books March 26, 2011

Posted by gordonwatts in Books, physics life.

Those of you watching would have noticed that a judge threw a spanner in the plans of Google to digitize the world’s book collection:

The company’s plan to digitize every book ever published and make them widely available was derailed on Tuesday when a federal judge in New York rejected a sweeping $125 million legal settlement the company had worked out with groups representing authors and publishers.

I am a huge fan of the basic idea. Every book online and digital and accessible from your computer. I’m already almost living the life professionally: all the journal articles I use are online. The physics preprint archive, arxiv.org, started this model and as a result has spawned new types of conversation – papers that are never submitted to journals. Pretty much the only time I walk over to the library is to look at some textbook up there. The idea of doing the same thing to all the books – well, I’m a huge fan.

However, I do not like the idea of one company being the gateway to something like that. Most of the world’s knowledge is written down in one form or another – it should not be locked away behind some wall that is controlled by one company.

I’d rather see a model where we expect, in the long term, that all books and copyrighted materials will eventually enter the public domain. At that point they should be easily accessible online. When you think of the problem like this it seems like there is an obvious answer: the Library of Congress.

Copyrighted books are a tougher nut to crack. The publishers and authors presumably will still want to make money off this. And making out-of-print books available will offer some income (though not much – there is usually a reason those books are out of print). In this case the Google plan isn’t too bad – but having watched journals price gouge because they can, I’m very leery of seeing this happen again here. I’d rather see an independent entity set up that will act as a clearing house. Perhaps they aren’t consumer facing – rather they sell access and charge for books to various companies that then make the material available to us end users. This model is similar to what is done in the music business. I purchase (or rent) my music through Zune – I don’t deal directly with any of the record labels. The only problem is this model doesn’t have competition to keep prices down (i.e. nothing stops this one entity from price gouging).

Lastly, I think having all this data available will open up a number of opportunities for things we can’t even think of now. But I think that we need to make sure the data is also available in a raw form so that people can innovate.

Print books are dying. Some forms will take longer than others – I would expect the coffee-table picture book to take longer to convert to all-digital than a paperback novel. But I’m pretty confident that the switch is well underway now. What we do with all the print books is a crucial question. I do think we should be spending money on moving these books into the digital age. Not only are they the sum of our knowledge, but they are also a record of our society.

Under Attack March 23, 2011

Posted by gordonwatts in DOE, university, University of Washington.

I’ve been trying not to make a comment on the budget situation in the USA. Or on the current discussion about teacher pay and benefits. Or about the state of science funding in this budget atmosphere. Or the drive to eliminate the Department of Education. Or the revival of the teach the controversy push. Others have made the case much more eloquently than I could have. This is more of a personal take on some of this: I’ve never felt under attack quite the way I do right now.

There seems to be a concerted attack on science funding in the US at the federal level. The feds fund most research that is too long term for a company to fund – which is becoming more and more of it as the stock market forces companies to think more and more short term. A healthy research program in a country needs to contain a balance for the sake of the long-term health of the economy. And a healthy economy is the only way to make jobs. The large cuts that are reputed to befall the Office of Science, which funds most of the national labs, will force lab closures. Facilities where we do science – gone! Thousands of people laid off. Heck, if you are trying to cut out 60 billion you can take a guess as to how many jobs that is worth. At $100,000 per person per year – so really nice jobs! – that is another 0.6 million added to the unemployment rolls. Right. That’s going to turn out well!
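
The back-of-the-envelope behind that number, in case you want to fiddle with the assumptions (both inputs are the rough figures quoted above, not official statistics):

    # Back-of-the-envelope from the paragraph above -- rough numbers, not official statistics.
    total_cut = 60e9         # proposed cut, in dollars
    cost_per_job = 100_000   # assumed fully loaded cost per job, per year

    jobs_lost = total_cut / cost_per_job
    print(f"{jobs_lost:,.0f} jobs")  # -> 600,000, i.e. about 0.6 million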

Second is this constant discussion about teacher pay. I’ve seen comments on newspaper articles with statements like “we are just paying them to babysit our kids.” Seriously?? Maybe we should just eliminate the schools and have the kids all at home. No formalized education system. Now, that has never been done before! And so obviously it must be better! Oh… wait. I guess it has been done before. I think it was called the Middle Ages… Arrgh! Yes, our K-12 system needs some real work. But beating the crap out of teachers in newspapers is not the way to get good people into the classroom! And the idea that teachers are overpaid? Seriously? [I’m not trying to channel Grey’s Anatomy here] I find that hard to believe. Perhaps they are getting better retirement plans for what they are paid – but I suspect that is because when the unions couldn’t negotiate a pay raise, they went for an increase in the pension instead. I wonder, if you paid teachers a fairer wage but kept their pension plans the same size, whether the ratio would be more in line with normal?

On a more local note, one of our state legislators was heard to say “Higher education is a luxury we no longer can afford.” I don’t even know where to start with that. Washington is like every other state: it has some rich people and some poor people. UW is a state school – the state provides subsidies for the in-state students to make it more affordable. Robust state and federal scholarship programs backfill for people really in need. The idea is that if you are good and you want to get a higher level education, the federal government, the state government, and the university will do their best to make sure that finances do not get in your way. This has been a bedrock of all higher education in the USA for many years now. Do we go back to a class-based system? What are people thinking, really? I get that they are trying to cut the budget, but think for a few minutes about the implications of what you are saying!

And to those who say education is radically more expensive than it has been in the past – at UW it is definitely true that the cost an in-state student pays has gone up a lot over the last 10-15 years. Definitely more than inflation (by a bit). But if you look at the amount of money the university spends to educate a single student, that has remained almost constant. Wait. For. It… That is right! State support has dropped dramatically. So the university has to cut expenses and find other sources of income – i.e. raise tuition. Blaming the university for this is misplaced. Last year in the state of Washington, after the state legislature cut the UW funding by 26%, the university raised tuition by 14% over two years. Legislators were known to stand up at town halls, etc., and express their displeasure at UW for doing that in hard economic times. I’m happy with them being displeased – I was displeased – but at least be honest and say that the state cut 26% of the university’s funding. It isn’t like that was a capricious raise!

Next is the push to increase the teaching load. I currently teach one class a quarter – so three a year (I get paid for only the 9 months that I’m teaching – I have to find my own funding for the rest of the year). That one class is about 3 hours in the classroom in front of students. Pretty cushy, eh!? I taught graduate particle physics this year. This is my third year so I’d like to think that I know it by now (not) – but all told during the week it would take about 20 hours of my time. The first time I taught it – when I had to teach myself some field theory – it was taking more like 50 hours a week. When I teach the easier undergraduate courses I tend to have 100’s of students – so it also works out to be about 20 hours a week. Some weeks a lot less, some a lot more. So, it would seem I have at least enough time to take on another course! Except there is one big problem here – my job isn’t just to teach undergraduates. My job is also to teach graduate students, mentor post-docs, and do research. UW is the #1 public institution in the USA when it comes to bringing in money from grants. Add another class and you will effectively change the nature of the University of Washington – make it a teaching institution rather than a research institution. The ramifications of something like that are huge – rankings, desirability, research & undergrads, etc. Do people who say things like this understand how all this is connected?

This last election brought in a lot of new people (at least at the federal level). I remember being elected to a few positions having to do with HEP. I had all sorts of ideas – but I discovered when I arrived that all the decisions that had been made were made for a reason! They weren’t arbitrary. You can’t go crashing around like a bull in a china shop – you have to carefully consider what you are doing and the ramifications. I get the feeling many of these new folks just don’t care. Really just don’t care. Even worse, they don’t know history – which means they are doomed to repeat it. Many of the ideas on the table around America have been tried before – if not here, then other places. I would love them to take a careful look. There is plenty of room for new things to achieve some of the same goals – why not try them rather than closing your eyes and just letting the knife fall where it may? In physics we call this a “prescale” – we just randomly throw out data because we have too much. Here we are randomly throwing out programs because we have too little. In both cases this is an implicit admission of defeat: we aren’t smart enough to make a strategic cut.
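
For the non-physicists, a prescale really is that blunt. Here is a toy sketch of the idea (random selection for simplicity – real trigger systems often use deterministic counters, and the function name and factor are just for illustration):

    import random

    def prescale(events, factor, seed=None):
        """Randomly keep roughly 1 out of every `factor` events and throw the
        rest away -- not because they are bad, but because there are too many."""
        rng = random.Random(seed)
        for event in events:
            if rng.randrange(factor) == 0:
                yield event

    # Keep ~1% of 100,000 dummy "events".
    kept = list(prescale(range(100_000), factor=100, seed=42))
    print(len(kept))  # roughly 1,000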

Ok. Enough. Thank goodness there is a counter balance in most cases to these drives to change things so radically. It won’t be pleasant, but the system is too large and what comes out of it too valuable to actually destroy it in a few short years, despite the best efforts of some. Now that I’ve vented, back to working on my classes and my research!

Update: Fixed “under paid” –> “over paid”. Of all the typos!

We’re Broke… or not… where is the data!? January 26, 2011

Posted by gordonwatts in DOE, NSF, science, University of Washington, USA.

It is hard for me not to feel very depressed about the way government funding is going in Washington – especially all the “cuts” that keep being mentioned. So I thought I’d spend an hour doing my best to understand what cuts are being talked about. Ha! Sheer fantasy!

Before I write more, I should point out that I very much have a dog in this race. Actually, perhaps a bit more than one dog. Funding for almost all my research activities comes via the National Science Foundation (NSF) – this is funded directly by Congress. My ability to hire post-docs and graduate students, train them, do the physics – everything, is dependent on that stream of money. Also, two months of salary a year come from that stream. In short, almost everything except for the bulk of my pay. That comes from two sources: the state of Washington and students’ tuition. A further chunk of money comes from the Department of Energy’s (DOE) Office of Science – they fund the national labs where I do my research, for example. In short, particle physics does not exist without government funding.

So when people start talking about large, across-the-board cuts in funding levels I get quite nervous. Many Republicans in 2010 campaigned on cutting back the budget, hard:

“We’re broke, and decisive action is needed to help our economy get back to creating jobs and end the spending binge in Washington that threatens our children’s future,” Mr. Boehner said.

Up until recently they really hadn’t said how they were going to do it – a typical political ploy. But now things are starting to show up: cut funding to 2008 levels, and then no increases to counter inflation. The latter amounts to a 2-3% cut per year. Not so bad for one year, but when you hit years 3-4 it starts to add up. You’ll have to let a student go, or perhaps downsize a post-doc to a student.

But what about all these other cuts? So… I’m a scientist and I want to know: where’s the data!? Well, as any of you who aren’t expert in the ways of Washington will discover… boy is it hard to figure out what they really want to do. I suppose this is to their advantage. I did find some numbers. For example, here is the NSF’s budget page. The 2008 funding level was $6.065 billion. In 2010 it was funded at a rate of $6.9 billion. So dropping from 2010 back to 2008 would be a 12% cut. So, if that was cut blindly (which it can’t be – there are big projects and small ones and some might be cut or protected), that would translate into the loss of about one post-doc, perhaps a bit more. In a group our size we would definitely notice that!

But is that data right? While I was searching the web I stumbled on this page, from the Heritage Foundation, which seems to claim that reducing the NSF to 2008 levels will save $1.7 billion – about 2x more than it looks like from the numbers above. Who is right? I tend to believe the NSF’s web page is more reliable. But, seriously, is it even possible for a citizen who doesn’t want to spend days or weeks to gather enough real data to make an independently informed decision?
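
Here is the arithmetic behind that comparison, using only the two NSF numbers quoted above (the $1.7 billion is Heritage’s figure, which I can’t reproduce from these inputs):

    # The two NSF numbers quoted above (from the NSF budget page).
    nsf_2008 = 6.065e9  # FY2008, dollars
    nsf_2010 = 6.900e9  # FY2010, dollars

    savings = nsf_2010 - nsf_2008
    print(f"Rolling back to 2008 saves ${savings / 1e9:.2f} billion")  # ~$0.84 billion
    print(f"That is a {100 * savings / nsf_2010:.0f}% cut from 2010")  # ~12%
    print(f"Heritage's $1.7 billion is {1.7e9 / savings:.1f}x that")   # ~2.0x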

Check out this recent article from the NYTimes about a proposal coming from Congressman Jordan whose goal is to reduce federal spending by $2.5 trillion through fiscal year 2021 (am I the only one who finds the wording of that title misleading?). As a science/data guy the first thing I want to know is: where is he getting all that savings from? There are lists of programs that are eliminated, frozen, or otherwise reduced – but that document contains no numbers at all. And I can’t find any supporting documentation that he and his staff must have in order to have made that $2.5 trillion claim. So, in that document, which is 80 pages long, I’m left scanning for the words “national science foundation”, “science”, “energy”, etc. Really, there is very little mentioned. But I have a very hard time believing that those programs are untouched – as the article in the New York Times points out, since things like Medicare, Social Security, etc., are left untouched (the lion’s share of the budget – especially in out years), all the cuts must come from other programs:

As a result, its effect on the entire array of government programs, among them education, domestic security, transportation, law enforcement and medical research, would be nothing short of drastic.

I agree with that statement. $2.5 trillion is a lot of cash! Can you find the drastic lines in that document? Well, perhaps you know more about Washington. I can’t. This gets to me because now if I have to get into an argument it is a very abstract one.

Pipedream: What I would love these folks to do is release a giant spreadsheet of the US gov’t spending that had 2008, 2009, 2010 levels, and then their proposed cuts, with an extra column for extra text. That is a lot of data, and would probably be hard to compile. But, boy, it would be nice!
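
Just to make the pipedream concrete, the table I have in mind would look something like this sketch – the column names and the single row are purely illustrative, and only the two NSF figures are real:

    import csv
    import sys

    # Purely illustrative layout: only the FY2008 and FY2010 NSF figures are real
    # (quoted above); every other value is a placeholder.
    header = ["program", "fy2008", "fy2009", "fy2010", "proposed_cut", "notes"]
    rows = [
        ["National Science Foundation", 6.065e9, None, 6.900e9, None,
         "roll back to 2008 level?"],
    ]

    writer = csv.writer(sys.stdout)
    writer.writerow(header)
    writer.writerows(rows)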

Tests are Good for You January 21, 2011

Posted by gordonwatts in Teaching, university, University of Washington.

The New York Times had an article the other day talking about a discovery that is making the rounds:

Taking a test is not just a passive mechanism for assessing how much people know, according to new research. It actually helps people learn, and it works better than a number of other studying techniques.

I’m here to tell you: duh!

In fact, we’ve institutionalized this in our physics graduate schools. Most university physics departments have the mother of all tests. Here at UW we call it the Qualifying Exam. Others call it a prelim (short for preliminary). And there is a joke associated with this exam, usually said with some bitterness if you’ve not passed it yet, or some wistfulness if you have long since passed it:

You know more physics the day you take the qual than you ever do at any other time in your life.

The exam usually happens at the end of your first year in graduate school. The first year classes are hell. Up to that point in my life it was the hardest I’d ever worked at school. Then the summer hits, and you get a small rest. But it is impossible to rest staring down the barrel of that exam, often given at the end of the summer just before the second year of classes start. You have to pass this exam in order to go on to get your Ph.D. And for most of us, it is the last (formal) exam in our career that actually matters. So psychologically, it is a big hurdle as well.

How hard is it? My standard advice to students is that they should spend about one month studying, 8 hours a day. For most people, if they study effectively, that is enough to get by. Some need less and some need more. This is about what it took me. What is the test like? At UW ours is 2 hours per topic, closed book, and nothing but working out problems. No multiple choice here! It lasts two days.

So, how do you study? There is, I think, really only one way to get past this. For 30 days, 8 hours a day, work out problems. There are lots of old qualifier problems on websites. Our department provides students with copies of all the old exams. Even if you don’t know the solution, you force yourself to try to work it out without looking it up in a book – break your brain on it. Once you can solve those problems without having to look at a textbook, you know you are ready. Imagine trying to study by reading a textbook, or by reviewing your first year homework problems. There is no way your brain will be able to work out a new problem after that unless you are a very unusual individual.

Note how similar this is to the results shown in the article:

In the first experiment, the students were divided into four groups. One did nothing more than read the text for five minutes. Another studied the passage in four consecutive five-minute sessions.

A third group engaged in “concept mapping,” in which, with the passage in front of them, they arranged information from the passage into a kind of diagram, writing details and ideas in hand-drawn bubbles and linking the bubbles in an organized way.

The final group took a “retrieval practice” test. Without the passage in front of them, they wrote what they remembered in a free-form essay for 10 minutes. Then they reread the passage and took another retrieval practice test.

The last group did the best, as you might imagine from the theme of this post!

This is also how you know more physics than at any other time in your life. At no other time do you spend 30 days working out problems across such a broad spectrum of physics topics. If you study and try to work out a sufficiently broad spectrum of problems you can breeze through the exam (literally, I remember watching one guy taking it with me just nail the exam in about half the time of the rest of us).

Working out problems – without any aids – is active learning. I suppose you could follow the article and say that forcing the brain to come up with the solution means it organizes the information in a better way… Actually, I have no idea what the brain does. But, so far this seems to be the best way to teach yourself. You are actively playing with the new concepts and topics. This is why homework is absolutely key to a good education. And this is why tests are good – if you study correctly. If you actively study for the test (vs. just reading the material) then you will learn the material better.

And we need to work harder at designing tests that force students to study actively. For example, I feel we are slipping backwards sometimes. With the large budget cuts that universities are suffering, one byproduct is that the amount of money we have to hire TAs to help grade our large undergraduate classes is dropping. That means we can’t ask as many open-ended exam questions – and have to increase the fraction of multiple choice. It is much harder to design a test that goes after problem solving in physics using multiple choice. This is too bad.

So, is this qualifier test a hazing process? Or is there a reason to do it? Actually, that is a point of controversy. Maybe there is a way to force the studying component without the high-anxiety of the make-or-break exam. Certainly some (very good) institutions have eliminated the qual. Now, if we could figure out how to do that and still get the learning results we want…

