Solstice drift, and how to fix it

The summer and winter solstices happen around the 20th of June and December, respectively. Around the 20th? That seems rather… imprecise, for an astronomical event with a precise definition: the time at which the Sun reaches its “highest or lowest excursion relative to the celestial equator on the celestial sphere” or, for a viewer standing on the Earth, its highest or lowest altitude above the horizon. This is determined by the Earth’s orbit and corresponds to the time at which your current hemisphere’s pole points most closely toward, or farthest from, the Sun. So why doesn’t it happen at the same time each year?

Inspired by an awesome book I recently acquired (“Engaging in Astronomical Inquiry” by Slater, Slater, and Lyons), I decided to investigate. I used the Heavens Above site to pull up historical data for the summer and winter solstices going back to 1980. I plotted the time for each solstice (in Pacific time) as its offset from some nearby day (June 20 or December 21). And sure enough, here’s what you get:

The solstice time gets later by about 6 hours each year, until a leap year, when it resets back by 24 – 6 = 18 hours.

Of course, the solstice isn’t really changing. The apparent change is caused by the mismatch between our calendar, which is counted in days (rotations of the Earth), and our orbit, which is counted in revolutions around the Sun. If each revolution took exactly 365 rotations, the calendar would stay in sync and no leap years would be needed. But since 365 days actually leaves us about 6 hours short of a full revolution, every 4 years we need to catch up by a full rotation (a day).
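The sawtooth pattern can be sketched in a few lines of Python. This is a hypothetical illustration (not the Heavens Above data): it uses the standard approximate tropical-year length of 365.2422 days, lets the solstice offset grow by the leftover fraction of a day each common year, and jumps it back when a leap day is inserted.

```python
# Sketch of the solstice "sawtooth": the solstice moment drifts later by the
# fraction of a day left over after 365 whole rotations, then jumps back a
# full day when a leap day is inserted into the calendar.

TROPICAL_YEAR_DAYS = 365.2422  # approximate length of the tropical year

def is_leap(year):
    """Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def solstice_offsets(start_year, n_years):
    """Offset (in hours) of the June solstice from a fixed calendar date,
    relative to an arbitrary zero in the starting year."""
    offsets = []
    offset_days = 0.0
    for year in range(start_year, start_year + n_years):
        offsets.append(offset_days * 24)  # convert days to hours
        # The calendar year between this June solstice and the next one
        # contains a leap day if February of the *next* year is a leap February.
        calendar_days = 366 if is_leap(year + 1) else 365
        offset_days += TROPICAL_YEAR_DAYS - calendar_days
    return offsets
```

In common years the offset grows by 0.2422 × 24 ≈ 5.8 hours; across a leap February it falls by 0.7578 × 24 ≈ 18.2 hours, matching the plot's roughly "+6 hours per year, −18 hours at a leap year" shape.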

Now, we all know about leap years and leap days. But this is the first time I’ve seen it exhibited in this way.

You can also see a gradual downward trend, due to the fact that the drift isn’t *exactly* 6 hours each year. It’s a little less than that: 5 hours, 48 minutes, and 46 seconds. So a full day’s correction every four years is a little too much. That’s why, typically, every 100 years we fail to add a leap day (e.g., 1700, 1800, 1900): the overcorrection of roughly 11.25 minutes per year * 100 years = 1125 minutes, and there are 1440 minutes in a day. But that’s not a perfect match either… which is why every 400 years, we DO have a leap day anyway, as we did in the year 2000.
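The arithmetic in that paragraph can be checked directly (the 11.25-minute figure is a rounding of about 11 minutes 14 seconds):

```python
# The leap-day correction amounts to 6 hours per year, but the true annual
# drift is only 5 h 48 m 46 s, so the correction overshoots slightly.
annual_drift_min = 5 * 60 + 48 + 46 / 60     # true drift: ~348.77 minutes
correction_min = 6 * 60                       # leap-day correction: 360 minutes

overshoot_per_year = correction_min - annual_drift_min    # ~11.23 minutes
overshoot_per_century = overshoot_per_year * 100          # ~1123 minutes

# Skipping one leap day per century removes 1440 minutes, more than the
# ~1123-minute overshoot, so the century rule itself overcorrects a little.
# That leftover accumulates to roughly a day over four centuries, which is
# why every 400th year keeps its leap day after all.
```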

This is what, in computer science, we call a hack.

And now it is evident why, for every other planet, we measure local planet time in terms of solar longitude (or Ls). This is the fraction of the planet’s orbit around the Sun, expressed as an angle from 0° to 360°. It’s not dependent on how quickly the planet rotates. It’s still useful to know how long a planet’s day is, but this way you don’t have to go through awkward gyrations if the year is not an integral number of days.
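As a simplified illustration, here is what Ls looks like if we (unrealistically) assume a circular orbit, so that Ls is just the elapsed fraction of the orbital period scaled to 360°. Real Ls calculations correct for orbital eccentricity, so treat this as a sketch; the function name and the Mars year length used here are my own illustrative choices.

```python
# Simplified solar longitude (Ls) for a circular orbit: the fraction of the
# orbit completed since the zero point (the northern spring equinox), as an
# angle in degrees. Note that the planet's rotation period never appears.

MARS_YEAR_DAYS = 686.98  # Mars's orbital period in Earth days (approximate)

def solar_longitude(days_since_equinox, year_length=MARS_YEAR_DAYS):
    """Approximate Ls in degrees, ignoring orbital eccentricity."""
    return (days_since_equinox / year_length) * 360.0 % 360.0
```

Halfway through the orbit gives Ls = 180°, and a full orbit wraps back to 0°, regardless of how long the planet's day is.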

By the way, you can get a free PDF version of “Engaging in Astronomical Inquiry”. If you try it out, I’d love to hear what you think!

Science is sexy (per Ira Flatow)

While at the American Geophysical Union meeting this week, I got to attend a lecture by Ira Flatow (host of NPR’s Science Friday). In his talk, “Science is sexy,” he argued that the image of science has changed from that of the scruffy-haired, wrinkled professor to a younger, fresher, more attractive one. As evidence, he cited the “Mohawk guy,” Mythbusters, and several movies and Broadway shows that have employed a scientific theme or concept. Popularizing science is one thing, but sexualizing it is another. He offered a smorgasbord of other examples that ranged from amusing (The Big Bang Theory, or “what would happen if Friends were physicists”) to offensive (the “It’s a Girl Thing” video) to delightfully clever (“The Longest Time” video on divergent evolution).

Ira also cited a paper called “The 95% Solution” which he reported as saying that Americans get only 5% of their knowledge about science from formal education (school). The rest of it, he said, comes from libraries, field trips, aquariums, the Internet, TV, and of course radio shows like Science Friday, which oddly he did not cite. :) But this rather surprising 5% claim turns out not to be quite what the original paper says. The authors note that the average person spends only 5% *of their lifespan* in a classroom. (I am not the average person.) The article encourages directing more resources to these other, non-classroom settings since they in theory have the chance to educate the population in “the other 95%.”

I liked their depiction of the U.S. as having a “vibrant free-choice science learning landscape” — i.e., we enjoy a wealth of opportunities for learning about science. But the article’s deeper argument is a little more radical. If only 5% of one’s lifetime is spent in school, it argues, then the current “school-first” approach of trying to get more and better-qualified science teachers into classrooms is not only misdirected, but it doesn’t work. The article notes that although U.S. schoolchildren lag behind their international peers in science literacy, U.S. adults outperform their counterparts. (The article cited no data source, and a web search I conducted suggested that this claim comes from a study showing that a whopping 28% of U.S. adults have basic science literacy: not exactly a stellar performance, and only barely edging out other countries.) Since “only 30% of U.S. adults have ever taken even one college-level science course,” the authors conclude that U.S. adults have been learning science from all of these other sources, not from school.

It’s great that people can learn new things from the world around them, and from informative displays and exhibits that have been set up. But it’s not really surprising; humans are natural scientists and experimenters, as anyone knows who’s watched a toddler for more than two minutes. Is it really the case that teaching science in schools is doomed to failure? Could we not continue working on curriculum innovations instead of giving up and heading to the Science Center IMAX? If the latter, why drag students through school in the first place?

Ira’s other big message was the need for effective science communication. He illustrated this point with one of my favorite examples, a video of Grace Hopper explaining what a nanosecond is to David Letterman (apparently CBS has yanked all copies of this video clip — very sad!).

He also noted that Neil deGrasse Tyson will host a new Cosmos show starting next year (woo hoo!).

Okay. Science is sexy.

Program or be programmed

I read Douglas Rushkoff’s book, “Program or Be Programmed,” with a mixture of fascination and criticism. I didn’t agree with every argument (e.g., that computer networks have no notion of time; many internet protocols use timestamps to ensure reliable communication), but each chapter gave me something to wrestle with mentally, and the book as a whole made me see various aspects of my life (interacting with technology) in a new light. Rushkoff’s thesis takes a historical view of how new technology penetrates society gradually, and those who develop the ability to manipulate and create, rather than just to use and consume, are the ones in control. Arguing from examples based on the development of writing, print, and electronic media, he notes that for us today, it’s the ability to program that gives us control over the new technological world, and that (somewhat chillingly) willful or accidental ignorance about the motives of Those Who Program may cause you to execute their Program without even knowing it.

This great, short video lets Rushkoff summarize his points in two minutes flat:

I am already a “programmer,” in that I have programming skills, but even so I consume most of what’s on the net as a user, rather than getting out there and being actively involved myself. Programming is what I do at work. On the other hand, I’ll never forget the thrill I experienced when I first contributed to an Open Source project. My art, my creation, uploaded into the ether after building on, complementing, and extending the work of complete strangers! And who knew where others might take it! It was like Free Love, but in C.

But after reading his book, I couldn’t help but think a while about what built-in biases about how various technologies work are shaping my own thoughts, habits, and ability to create.

This point, however, is the tenth of his 10 commandments. The earlier ones have value too; it never hurts to get another reminder of the value of not always being “on”/”connected,” and of being present in the here and the now.

Will humans ever go to Mars?

I get asked that question a lot. I end up giving two answers: my own wishful dreams, and the less inspiring view of what I think might actually happen.

I recently came across a thoughtful article that matches my complicated views on the subject very well. It’s titled “Mission to Mars: Will America Lose the Next Frontier?” After noting the merits of the MSL rover, the article points out the downside of the project: by going almost $1B over its initial cost estimate, MSL has forced the delay or cancellation of other Mars endeavors. (I believe that the article’s note about the cancellation of the Mars 2016 mission is a reference to the 2018 MAX-C mission, a step on the path to sample return, which was canceled. We do have a mission slated for 2016, announced after the article’s publication: the Mars InSight lander.) Similarly, the article notes the terrible impact that the James Webb Space Telescope has had on NASA’s astrophysics program. JWST is NASA’s poster child for mind-blowing cost overruns. Initially estimated at $500M, it’s grown by leaps and bounds and is now estimated at $8B. Both MSL and JWST are sure to deliver rich scientific gains in their respective missions. However, I think this article is correct and fair to note the other efforts that have fallen by the wayside to ensure that these projects are completed.

The main message of the article, however, is the bigger view on what this means in terms of larger, longer-term goals:

“But today, thanks to a combination of budgetary stress, regulatory overkill, and an unfortunate lack of political skill at the highest levels of NASA, the Mars exploration program is in deep trouble. It may be a very long time before the U.S. space agency launches another significant Mars mission.”

Put simply, NASA doesn’t have the budget to send humans to Mars. “Regulatory overkill” refers to a strict intolerance of any NASA failure, no matter how large or small, which necessitates over-engineering (and ballooning costs). Unless something dramatic changes in NASA’s leadership, political weight, or budget, it’s unlikely that our space agency is going to get us there. But all is not lost; Elon Musk is on the job.

Why one enlisted in 1917

One of my duties at the Monrovia Library is to take old newspapers on microfilm and scan them into electronic files. We anticipate that this will make them much easier for patrons to use, and it will mean less wear on the microfilm itself.

I’ve been working on scanning the Monrovia Weekly News from 1915-1917 lately, and sometimes my attention is caught by unusual ads or articles. This item, from May 26, 1917, definitely stood out.

For context, what was happening in 1917? That’s right, World War I (at the time, the Great War). The U.S. had declared war on Germany just seven weeks earlier, on April 6, 1917. Before it was over, we’d lose 116,000 U.S. lives.

And the straight-shouldered grandfather? He’d have possibly fought in the Civil War, 56 years earlier. A bit more complicated, that, to consider it an answer to the call of the Flag (presumably the North and nationalism, vs. the South and federalism).

Regardless, a sobering take on conscription and enlistment. Does our Flag have the same call today?
