Friday, December 27, 2013

Technocracy. It's new.

No. It's not.

I've read a number of recent books and articles about how technology, particularly computers and robots, will change everything and create a bipartite society, where "there will be those who tell computers what to do, and those who are told what to do by computers" – to put it compactly. (As a computer engineer, I sort of approve of this message. :-)

This idea of a bipartite society with a small elite lording over the undifferentiated masses is not new (really, not new at all). That it's a result of technology instead of divine intervention or application of force is also not new, but, since most people have an "everything that happened before me is irrelevant because my birth was the most important event in the totality of space-time" attitude towards the past, this is ignored.

There are a few reasons contributing to the popularity of this idea:

It's mostly right, and in a highly visible way. Technological change makes life harder for those who fail to adapt to it. In the case of better robotics and smarter computers, adaptation is more difficult than it was for other changes like the production line or electricity. One way to see this is to note how previously personalized services were first productized (ex: going from real customer service representatives to people following an interactive script on a computer) and then the production processes were automated (ex: from script-following humans to voice-recognition speech interfaces to computers). Technological change is real, it's important, and it's been a constant for a long time now.

(Added Jan. 7, 2014: Yes, I understand that the economics of technology adoption have a lot to do with things other than technology, namely broader labor and economic policies. I have teaching exercises for the specific purpose of making that point to execs and MBAs. Because discussion of these topics touches the boundaries of politics, I keep them out of my blog.)

It's partially wrong, but in a non-obvious way. People adapt and new careers appear that weren't possible before; there are skilled jobs available, only the people who write books/punditize/etc don't understand them; and humans are social animals. The reason why these are non-obvious, in order: it's hard to forecast evolution of the use of a technology; people with "knowledge work" jobs don't get Mike Rowe's point about skilled manual labor; most people don't realize how social they are.

(On top of these sociological reasons there's a basic point of product engineering that most authors/pundits/etc don't get, as they're not product engineers themselves: a prototype or technology demonstrator working in laboratory conditions or very limited and specific circumstances is a far cry from a product fitting with the existing infrastructure at large and usable by an average customer. Ignoring this difference leads authors/pundits/etc to over-estimate the speed of technological change and therefore the capacity of regular people to adapt to it.)

Change sells. There's really a very small market for "work hard and consume less than you produce" advice, for two reasons. First, people who are likely to take that advice already know it. Second, most people want a shortcut or an edge; if all that matters is change, that's a shortcut (no need to learn what others have spent time learning) and gives the audience an edge over other people who didn't get the message.

It appeals to the chattering classes. The chattering classes tend to see themselves as the elite (mostly incorrectly, in the long term, especially for information technologies) and therefore the idea that technology will cement their ascendancy over the rest of the population appeals to them. That they don't, in general, understand the technologies is itself beyond their grasp.

It appeals to the creators of these technologies. Obviously so, as they are hailed as the creators of the new order. And since these tend to be successful people whom some/many others want to understand or imitate, there's a ready market for books/tv/consulting. Interestingly enough, most of the writers, pundits, etc, especially the more successful ones, are barely conversant with the technical foundations of the technologies. Hence the constant references to unimportant details and biographical information.

It appeals to those who are failing. It suggests that one's problems come from outside, from change that is being imposed on them. Therefore failure is not the result of goofing off in school, going to work under the influence of mind-altering substances, lack of self-control, the uselessness of a degree in Narcissism Studies from Givusyourstudentloans U.  No, it's someone else's fault. Don't bother with STEM, business, or learning a useful skill. Above all, don't do anything that might harm your self-esteem, like taking a technical MOOC with grades.

It appeals to those in power. First, it justifies the existence of a class of people who deserve to have power over others. Second, it describes a social problem that can only be solved by the application of power: since structural change creates a permanent underclass, through no fault of their own, wealth must be redistributed for the common good. Third, it readily identifies the class of people who must be punished/taxed: the creators of these technologies, who also create new sources of wealth to be taxed. Fourth, it absolves those in power from responsibility, since it's technology, not policy, that is to blame. Fifth, it suggests that technology and other agents of change should be brought under the control of the powerful, since they can wreak such havoc in society.

To be clear, technology changes society and has been doing so since fire, the wheel, agriculture, writing – skipping ahead – the printing press, systematic experiments, the production line, electricity, DNA testing, selfies... The changes these technologies have brought are now integrated in the way we view the world, making them so "obvious" that they don't really count. Or do they? Maybe "we" should do some research. If these changes were obvious, certainly they were accurately predicted at the time. Like "we" are doing now with robots and AI.

You can find paper books about these changes on your local sky library dirigible, which you reach with your nuclear-powered Plymouth flying car, wearing your metal fabric onesie with a zipper on your shoulder, right after getting your weekly nutrition pill. You can listen to one of three channels bringing you music via telephone wires, from the best orchestras in Philadelphia and St. Louis while you read.

Or you can look up older predictions using Google on your iPhone, while you walk in wool socks and leather shoes to drink coffee brewed in the same manner as in 1900. The price changed, though. It's much cheaper to make, but you pay a lot more for the ambiance.

Think about that last word.

Friday, December 6, 2013

Word salad of scientific jargon

"The scientists that I respect are scientists who work hard to be understood, to use language clearly, to use words correctly, and to understand what is going on. We have been subjected to a kind of word salad of scientific jargon, used out of context, inappropriately, apparently uncomprehendingly." – Richard Dawkins, in the video Dangerous Ideas - Deepak Chopra and Richard Dawkins, approximately 27 minutes in.

That's how I feel about a lot of technical communications: conference panels, presentations, and articles.  An observed regularity: the better the researchers, the less they tend to go into "word salad of scientific jargon" mode.

Wednesday, November 27, 2013

Intellectual counterfeit fashionistas and the corruption of STEM and analytics

I have acquaintances who say they like classical music but never listen to it and can't tell Bach from Brahms. While this is entertaining to classical music aficionados, a similar disconnect happens in STEM and business analytics, where it has serious consequences.

I've observed many people who are always saying how important science is, who can name several recent Nobel laureates in the sciences, but can't compute the kinetic energy of a 2-ton SUV going 65 MPH (766 kJ), or, ironically, can't explain what the research of those Nobel laureates was about.
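
For the record, a minimal sketch of that back-of-the-envelope calculation, in Python, assuming "ton" means the US short ton of roughly 907 kg:

    # Kinetic energy of a 2-ton SUV at 65 MPH: KE = (1/2) m v^2
    mass_kg = 2 * 907.185        # 2 US short tons in kilograms (assumption)
    speed_ms = 65 * 0.44704      # 65 mi/h converted to m/s
    ke_kj = 0.5 * mass_kg * speed_ms ** 2 / 1000
    print(f"{ke_kj:.0f} kJ")     # prints 766 kJ

Nothing beyond high-school physics and two unit conversions.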

I know people who are always talking about Big Data™ and "the" Management Information Revolution™ (yes, they think the current one is the only one), but cannot write Bayes's formula and think that standard deviation is the same as standard error.
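
For reference, since these two keep coming up: Bayes's formula is

$\qquad P(A|B) = \frac{P(B|A) \, P(A)}{P(B)}, $

and the standard error is the standard deviation of an estimator, not of the data: for the sample mean of $n$ independent observations, $SE = s/\sqrt{n}$, which shrinks as the sample grows, while the standard deviation $s$ does not.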

These are the signs of the rise of the intellectual counterfeit fashionista (ICF). The ICF wants others to consider him or her an intellectual (that's the I), up to date on the latest hottest intellectual topic (that's the F), but is not willing to do the work and the learning necessary to understand that topic (that's the C).

No matter how infuriating or entertaining an ICF can be on a personal level, their rise is a problem -- chiefly because of their effect on education, the practice of technical professions, and the general perception of STEM and analytics in society.

Education: by trying to recruit proto-ICFs into STEM/analytics, teaching institutions end up having to water down their courses, since the ICFs don't want to do the work needed for real learning. This leads to lower quality education for every student, even the non-ICFs.

In the mid-to-long term, this creates a number of credentialed ignoramuses and gives rise to the strange situation where people who hire engineers say there's a dearth of them, while engineering associations say there's a glut. I guess it depends on how you define engineer, by skills or by credentials.

Professions: the obvious effect of ICFs is the rise in average incompetence. The more pernicious effect is the destructiveness of internal politics, which always increases in organizations with large numbers of people for whom appearances and narratives are more important than observable realities and hard work.

I wish nerdiness became unfashionable again, so that the ICFs moved on to corrupt something else and left STEM and analytics alone.

Sunday, November 24, 2013

Carrying less to do more

Every so often I look back at a packing list from some years ago, and find myself flabbergasted at how much simpler travel has been made by technological advance and some judicious choices.

This is all the hardware (plus cell phone, mine being a prepaid for emergencies only) I carry on a work trip:

Laptop
+ Power brick
+ VGA adapter
+ Presentation remote (with green laser and 4GB drive)

iPod Touch 5
+ Charger cable
+ audiophile earphones
+ sports earphones

Backup hard drive (2TB of space, mostly filled with optional content for work & downtime)
+ USB 3 cable

Large capacity USB flash drives (including a 32GB one on my keychain!)

Rite-in-the-Rain notebook
+ Fisher space pen

Microfleece cleaning tissues doubling as packing material.

Add a magazine to read when electronic devices aren't allowed (I get The Tech and Smithsonian Magazine for free, so I take those and dispose of them when done), clothing (planning helps), toiletries, and food for travel.

The magic enabling the ever-shrinking, ever-more-powerful hardware packing comes from multitaskers and digital content.

The iPod Touch replaces a lot of equipment I previously carried (iPod, still camera, video camera, voice recorder, backup remote control for presentations, books to read, and even my iPad 1.0 in many respects). The hard drive carries a hitherto unthinkable library of work and play stuff. (I don't play computer games, other than the occasional solitaire, bejeweled, or mahjong, so I don't carry -- or own -- a game controller.)

The second part of the magic is the move to digital content.

Many years ago I'd carry a small sleeve case with CDs for my Sony Discman (Get off my lawn, kids!!!), some DVDs, paperback books, work books -- hardcover textbooks! -- and other heavy objects with minimal bits-to-atoms ratio. Now I carry thousands of music tracks, hundreds of books, audiobooks, and technical papers, dozens of movies, videos, and television shows, and even a few comic books for nostalgia's sake, all as bits on the hard drive. (Obviously these are not the only copies I have of those bits.)

Anything important is backed up in a multiplicity of places: laptop hard drive, portable hard drive, USB flash drives, multiple online services. Because it's well known that anything important of which you only have one copy will, by the laws of Physics, necessarily be lost, inoperative, or confiscated by the TSA.

Of course, you still need to bring a few changes of clothes and toiletries. There's no digitizing those.

Thursday, November 21, 2013

The roots of my disillusionment with 'official skeptics'

There were several contributing events, all alike in one respect: 'official' skeptics proved to be skeptics in name but not in actuality. This is one of those events, involving James Randi, whom I still admire.

James Randi had a long feud with Uri Geller regarding spoon bending. Now, I used to do a lot of spoon bending myself before I got an OXO Good Grips ice-cream scoop, but that's not the type of spoon bending that set Messrs. Randi and Geller at loggerheads.

Mr. Geller claimed he had paranormal powers, which he demonstrated by bending spoons. Mr. Randi implied (for legal reasons he couldn't outright state) that Mr. Geller was in fact using prestidigitation. (For a moment, ignore the obvious question of why someone with paranormal powers would use them to bend eating utensils instead of, say, making a fortune on Wall St.) You'd think that Mr. Randi would explain how the trick is done, so that the audience could check whether Mr. Geller was in fact using that trick.

No. Mr. Randi invoked the Magician's Code and declined to explain how the trick is done. (FYI: you bend the spoon with finger pressure or against a table; it takes a bit of practice to do it without other people noticing, and even with practice they will notice if they're looking for it.) So here is Mr. Randi, allegedly a skeptic, asking his audience to accept on faith that there exists such a trick that Mr. Geller could be using.

When Mr. Randi replicated the great feat of spoon bending, allegedly using a trick, Mr. Geller took advantage of Mr. Randi's adherence to the Magician's Code to say that Mr. Randi was in fact using his -- Randi's -- paranormal powers. All because Mr. Randi's argument relied on the audience's faith, not on a testable proposition.

Now, that's ironic.

-- -- -- --
Note: this vignette was part of the post "Fed up with 'trust us, we're experts' science," but it detracted from the point of that post so I separated it into its own post.

Wednesday, November 20, 2013

Fed up with "trust us, we're experts" science

Somehow in my lifetime we went from Feynman's idea of science requiring 'a belief in the fallibility of experts' to a caste system where science experts must be trusted without question, and acolytes jump on anyone who dares ask anything.

The trigger event for this rant was the Mythbusters Breaking Bad Special, in particular the test of the hydrofluoric acid disposal of a body in a bathtub, which ends with a big hole in the floor and ceiling of Jesse's home. (Season 1, Episode 2, "Cat's in the Bag...")

(Big Breaking Bad fan here, and still grudgingly a fan of the Mythbusters.)

First off, the Mythbusters test the effect of the 100 ml of hydrofluoric acid on a number of samples of the materials involved (meat, wood, drywall, iron, steel, linoleum), all of the same size. Yes, size, not appropriate mass computed from a molar calculation. Apparently no one thought of asking a chemist (though one is present to run the experiment) about mass balance and stoichiometry.
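
To make the mass-balance point concrete, here's a minimal sketch in Python; the concentration, density, and reaction are my assumptions, since the show doesn't specify them:

    # Stoichiometric ceiling for 100 ml of HF solution. All values assumed:
    # 48% w/w HF (a common commercial concentration), density ~1.15 g/ml,
    # iron dissolving as Fe + 2HF -> FeF2 + H2.
    volume_ml = 100.0
    density_g_ml = 1.15
    mass_fraction = 0.48
    molar_mass_hf = 20.01      # g/mol
    molar_mass_fe = 55.85      # g/mol

    moles_hf = volume_ml * density_g_ml * mass_fraction / molar_mass_hf
    max_iron_g = (moles_hf / 2) * molar_mass_fe   # 2 mol HF per mol Fe
    print(f"{moles_hf:.1f} mol HF dissolves at most {max_iron_g:.0f} g of iron")
    # prints: 2.8 mol HF dissolves at most 77 g of iron

Whatever the exact reagents, a calculation like this bounds how much material 100 ml can possibly dissolve -- which is precisely the question that same-size samples sidestep.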

After they fail to dissolve these objects with the apparently arbitrarily chosen volume of hydrofluoric acid, the Mythbusters move on to replicate the scene in the show with a different solvent.

This is the point when I really lose it: they say that the solution to the body-disposal problem is to use sulfuric acid and a secret sauce.

A. Secret. Sauce.

Because knowledge should only be held by experts?! Say whaaa?

This is what science entertainment teaches its audience: if you're not an expert, you should not expect full information: "Trust us, we know what's going on, and you'll get to see the result on TV, so it's real." Of course this trains audiences to (a) accept TV as the authority on who's an expert; (b) believe in experts' statements without requiring proof or independent verification; and (c) think of science as something beyond the comprehension of the audience member, and therefore not to be questioned by him or her.

Yes, I get their legalistic "we're not here to teach people how to dispose of bodies," but it's ridiculous: acquiring the large quantities of acid necessary would be more suspicious than a number of other ways that can easily be found on the interwebs or on Bones or Dexter. Joe Pesci explains the traditional approach at the beginning of Casino: "dig the hole before you whack the guy, so you don't have to dig it with the body out in the open."

(The secret sauce is hydrogen peroxide, another chemical that would really raise eyebrows -- FBI and DHS eyebrows -- if purchased in quantity, since it is used for improvised explosive devices. Also, really really really temperamental chemical.)

Then I remembered the Mythbusters had done this before, in the thermite episode, for which they blurred the names of the igniter reagents. FYI, to ignite thermite you drop glycerol on a mound of potassium permanganate on top of the thermite; though you can simply use a long-neck torch, like they did on, oh irony, Breaking Bad.

When I was a kid, I liked chemistry almost as much as electronics, and this is the kind of thing we got to play with before the world became full of Sitzpinkler. Do they even sell chemistry sets for children anymore? If not, where is the next generation of chemists and chemical engineers going to come from? Chemistry can be dangerous, but bringing up an entire generation ignorant of it is terminally stupid. But I digress...

Back to the main problem: It has become acceptable to make the argument that the audience should trust the experts on faith, since the technical stuff is either too difficult or too dangerous or too easily misused by the non-initiated.

This kind of thinking is more dangerous to science than 10 Tomás de Torquemadas. Because this is the kind of thinking that creates 10,000 Torquemadas, all convinced that they are the paladins of science and all ready to auto-da-fé those whom the experts deem to be the enemies of Science™. Thus quelling dissent and killing the basis of all progress in science.

A lot of people will line up for this; after all there are many people who like the idea and image of science. As long as they don't have to learn any, of course.

-- -- -- --
Note: edited on Nov 21st to remove unnecessary detour about "skeptics."

Friday, November 8, 2013

Thoughts inspired by a science joke

Another day, another science joke. Not a very funny one, but enlightening.

When I say "science joke," I mean one that involves a modicum of science knowledge. Which makes this yet another post against the scientistologists that are all in favor of science as long as they don't have to learn any. They like the idea and the image of science, but are not willing to do the work necessary to learn it.

Last Sunday I tweeted: According to my alarm clock, the computer & phone spent two hours moving at almost 90% of speed of light. That's one explanation.

Since that was the end of Daylight Saving Time, what that tweet says is that clocks which get a synchronization signal from the internet were one hour behind those that I have to reset manually. The twist is that I calculated what speed would compress time 1:2, $v = 0.8660254 c$, and included that in the joke.
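
For the curious, the arithmetic: a clock running at half rate means a Lorentz factor $\gamma = 2$, and

$\qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} = 2 \quad \Rightarrow \quad v = c \sqrt{1 - \tfrac{1}{4}} = \frac{\sqrt{3}}{2} c \approx 0.8660254 c, $

hence "almost 90% of speed of light."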

(By the way, this time compression is an example of the twins "paradox," which is not paradoxical at all.)

As for the people who "love science" (as long as they don't have to learn any), well, many of them have a vague notion that I was referring to relativity, but no idea whether the 90% number was right, wrong, or random. Science is something they believe in, without actually knowing any of the details.

More and more people are falling into this trap of believing in science as opposed to actually learning it. That is a very bad trend in a technology-dependent society.

Thursday, November 7, 2013

Twitter valuation is a bet on network value

No, I don't think Twitter is prima facie over-valued.

The following graphic from the WSJ (reproduced here because deep linking is discouraged) has been making the rounds, generally in support of the idea that Twitter's valuation is yet another finance mistake:

[WSJ chart comparing valuations and revenues of social media companies, including LinkedIn and Twitter]

But here's the funny thing. Note how both LinkedIn and Twitter are apparently over-priced, and suddenly an alternative explanation appears: the market understands that, while right now the revenue models of these companies are not good, there is value in their networks that, either directly through advertising and other attention-monetizing strategies, or indirectly via the information value of the network, will eventually be captured. (Even if that requires a change of management, which sometimes it does.)

It's a bet on the future value of networks and their associated preference and communication data.*

As I mentioned in my post about the Skype acquisition, these companies are not just some black-box generators of revenue. In particular Twitter's resources include:
  • Knowledge of the network to a level of detail that can be closed off to outsiders.
  • Personnel and technology that allow for exploitation of the knowledge in the network; inasmuch as the data and technology have unique features, the personnel and the resources are partially locked into the company and are assets to be taken into account in valuation.
  • An installed base that serves as a barrier to entry to competitors trying to build their network.
So, not being privy to the financial details, I cannot say whether the valuation is right or wrong, but I can certainly say that people who pass judgment on that valuation based on last year's revenue are terminally myopic. Sadly, even people whom I respect seem to fall into this trap.

There's gold in those networks and the nerds who can analyze them.

-- -- -- --
* A bet not dissimilar to that of Google trying to build their own social network with Google Plus and all the actions they take in other properties like YouTube trying to nudge people into using the social media affordances of Google Plus instead of the older comments and video responses (now discontinued, get your linkage on G+).

Monday, October 21, 2013

My phone is just as smart as you guys!

Dunning-Kruger Effect, the internet is your multiplier.

Anyone can search for anything, which makes knowing what to search for and how to interpret the results more important than ever. The commoditization of information increases the value of knowledge.

Early on in the most recent episode of The Big Bang Theory (season 7, episode 5, "The Workplace Proximity" *), Amy, Bernadette, and Penny are in Penny's apartment drinking wine and talking about Amy's temporary move to Caltech:

Amy: "I'm leading a study to see if deficiency of the monoamine oxydase enzyme leads to paralyzing fear in monkeys"

[Bernadette lets slip that she might have done that research with death row convicts, which she quickly denies because it would have been unethical.]

Penny: "Not many people know this, but the monoamine oxydase [mispronounced as "oxidize"] enzyme was discovered by a woman, Mary Bernheim.

[Bernadette and Amy are stunned.]

Penny: "That's right. My phone is just as smart as you guys."

And this captures a common confusion between knowledge and information. Note the pathologies illustrated in that vignette:

1. Who discovered MAO is irrelevant to the work Amy will be doing. Like Penny, many people pluck some vaguely related fact from the internet and interject it into a discussion, under the illusion that it will make them appear knowledgeable. This behavior is becoming more and more common, especially with smartphones, but knowledge is a lot more than a simple collection of facts.

2. Penny searches for MAO because someone else brought up the topic. Without a framework of knowledge to integrate facts, people who depend on search don't know what to search for. In other words, the input for a meaningful search requires knowledge.

3. Even if Penny found useful MAO information, for example the mechanism by which it catalyzes the oxidation of monoamines and affects mood, she wouldn't be able to interpret the biochemistry and neuroscience involved. In other words the output of the search only gets meaning through knowledge.

Yes, I understand it's a joke. But this attitude that learning substantive material is passé, made unnecessary by the existence of search engines — an attitude that sadly can be found even among educators — is corrupting, corrosive, and counterproductive.

Without knowledge, information is useless. More people making knowledge-poor searches leads to more random facts being flung haphazardly into discussions; this makes having the knowledge to select and interpret the important facts more valuable than before.

Knowledge is power, the power to use information. Pity so few people know that.

-- -- -- --
* Even though the general arc of the show has become a soap opera, there are still some good jokes in each episode, and the final joke in this one is among the best.

Friday, October 18, 2013

Digging too deeply into a Heisenberg (Physics not crystal meth) joke

Some days ago I saw and retweeted this joke:
Police officer: "Sir, do you realize you were going 67.58 MPH?
Werner Heisenberg: "Oh great. Now I'm lost."
Ok, it's a funny joke, provided you have a passing acquaintance with basic physics.

But here's my problem: a lot of people who kinda-sorta understand that joke have no idea what's really behind it. And that's a problem I've had for a while now with the "science fanclub that cannot do basic science," as I call them. (The people who think that Surely You're Joking, Mr. Feynman! is a physics book and like to watch soap-opera biographies of scientists, heavy on the drama, light on the actual science.)

[Added later] My problem with these people is that they perceive science as something that comes from authority and must not be questioned or further investigated by others. For example, they "know" that the position and the velocity of an elementary particle cannot be jointly determined with arbitrary precision; but when pressed about how they know that, they say something about "Cosmos" or mention a Richard Dawkins book (which of course would not cover this); they behave as acolytes to those they recognize as high priests of science, who – presumably – are anointed by a Council of Wise Ones. That's precisely the opposite of what gave science its success, the idea that anyone can question received wisdom and experiment or observation are the ultimate arbiters of correctness. [End of addition.]

A simplified form of Heisenberg's inequality, good enough for our purposes, is

$\qquad \Delta p \, \Delta x \ge h $

Going by orders of magnitude alone, assuming that the mass of Heisenberg plus car is on the order of 1000 kg, and noting that the speed is given to a precision of 0.01 mi/h, about $4.5 \times 10^{-3}$ m/s, we have $\Delta p = m \, \Delta v \approx 4.5$ kg m/s. With $h \approx 6.6 \times 10^{-34}$ Js, we get a $\Delta x$ on the order of

$\qquad \Delta x \approx \frac{6.6 \times 10^{-34}}{4.5} \approx 1.5 \times 10^{-34}$ m.
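
A quick numeric check, a minimal sketch in Python with the same assumed 1000 kg mass:

    # Order-of-magnitude check of the position uncertainty above.
    h = 6.626e-34             # Planck's constant, J s
    mass_kg = 1000.0          # assumed mass of Heisenberg plus car
    dv_ms = 0.01 * 0.44704    # speed precision: 0.01 mi/h in m/s
    dx_m = h / (mass_kg * dv_ms)   # from dp dx >= h, with dp = m dv
    print(f"dx >= {dx_m:.1e} m")   # prints dx >= 1.5e-34 m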

That's a lot of precision to consider oneself lost. For comparison, the width of a typical human hair is on the order of 10-100 micrometers, or $10^{-5}$ to $10^{-4}$ m.

Yes, these numbers show how stupid it would be to use Heisenberg's Uncertainty Principle for macroscopic observations. That's the joke; the fact that many members of the science-fanclub have no idea of the magnitudes involved but like to lord their science-fandom over others is part of my irritation.

I see this all the time in my job, with people who can't write Bayes's formula talking loudly about graphical models (should really be graphal models, BTW, since they are based on graphs, not graphics).

Sunday, July 28, 2013

For better presentations, avoid most presentation advice

If you want to become a better presenter, you probably should avoid most advice about presentations.

Yes, here I am, an educator, apparently telling people to avoid sources of knowledge. The problem is that much presentation advice is not a source of knowledge; it's more like a source of sophistry that helps perpetuate some of the worst problems with presentations.

As an avid reader of books, articles, and blog posts about presentations, I identified a few pathologies from the mass of material available:

1. Presentationism. This is what I call the tendency of people who do presentation training or information design training to focus on the style and delivery of the presentation instead of the substantive material that the presentation is about. This is a form of professional deformation, but one that can become a serious obstacle to understanding the real value of presentation skills: usually that of changing the audience's mind, unless the presentation is being done for entertainment, legal, or other purposes.

2. Perfectionism. The idea that all presentations have to be done to a standard of excellence and that all presenters should put as much effort as needed into preparing, rehearsing, delivering, and clarifying every presentation. In reality there are many people who have to do presentations with minimal resources, for whom the time and effort required to create a better presentation represent a net loss of value.

3. Ideological purity. Instead of choosing the best tool for a given presentation, many authors are strict ideologues: the presentation should conform to their choice of tool and styles. This affects some famous authors in information design and presentation techniques, and has led to pointless arguments about which tool is better, tout court. Like arguing whether a hammer or a drill is a better tool, independently of the project, and equally pointless. This creates a subordinate problem:

4. Subject matter and audience independence. According to a plurality of authors, Einstein presenting to an audience of Princeton scientists and the Frito-Lay head of sales for northeast Kentucky reporting on the penetration of new chip varieties to a group of mid-level executives should prepare and deliver their presentations in about the same manner, with similar presentation support (typically, though not always, slides), and about the same effort. To be clear, these authors don't suggest that the substance of the presentation should be the same, but rather that the process of preparing and delivering these presentations and the style and design of the materials should be the same.

5. The "tricks and tips" distraction. Many authors offer only tricks and tips, which may be good or bad, but in general create a false sense of learning: the problem with most bad presentations is systemic, not something that a tip will solve. Similarly, a lot of authors use cherry-picked results from psychology to support their approach. As a general rule, unless you can read the original source and determine whether the result applies to your circumstances, it's better to ignore this.


So, what is someone who wants to become a better presenter to do? I've written about it (note the "most" in the title above, which is not "all" on purpose), and here are three further recommendations:

- James Humes's Speak Like Churchill, Stand Like Lincoln is a short, well-thought-out book on public speaking.

- Edward Tufte's books, courses, and web site, despite a bit of ideological purity, are possibly the best source for people for whom getting complex messages across to their audience is important and worth the effort.

- Don Norman's critique of Tufte makes a good counterpoint piece for ET's works.

Above all, think critically about the advice being given; ask "does this make sense in my case?" Even the best advice has exceptions.

Sunday, January 20, 2013

On hiatus

I'll be taking a break from blogging in order to finish a number of writing projects.

I'll probably tweet the occasional pithy thought and post any photos I find interesting. But long-form blogging is unlikely to continue in the previous form; when I return I'll probably be posting book notes or observations about coding in R.