Michael McCullough is a professor of psychology at the University of California, San Diego, where he runs the Evolution and Human Behavior Lab. Mike and I had a chat about his new book, "The Kindness of Strangers." The title for that book as originally conceived was "Why We Give a Damn" -- and even prior to that, "Why We Don't Give a Damn." I happen to like those titles, though I can understand why the publisher didn't, and so I thought I'd trot them out to have a modest life of their own. In this conversation, we talk about Mike's first inspiration to study psychology, the influence of Christianity on his personal development and later his study of religion, his approach to mentorship, where he thinks the conversation surrounding the biological basis of altruism went wrong, and rethinking the parable of the Good Samaritan.


Notes from a PhD student who doesn't plan to pursue academia.

A couple of weeks ago, Mickey Inzlicht and Yoel Inbar, of the excellent podcast Two Psychologists Four Beers, released a discussion of theirs called 'Against Academia?' The motivation for the discussion was that Mickey had noticed several occasions on which people -- one of them being himself -- were called out for expressing positive takes on life in academia. The mainstream position to hold on academic life is a negative one: it's a biased system, a pyramid scheme, a travesty for mental health, etc. They break down a number of considerations on both sides, digging into where some of these claims are identifying something real and important and other areas where they are exaggerations. Both Mickey and Yoel are tenured professors at an excellent university, and they acknowledge that their view is one of the better ones you can get in an academic appointment. It's a really useful conversation that brings up a lot of worthwhile, well-considered points. I'd like to add to it by responding to it from my own perspective: as someone currently doing a PhD who does not plan to pursue an academic career.


One of the premises of Mickey and Yoel's conversation is an assumption about why people go into academia. It is a common assumption to make in these sorts of discussions. The assumption is essentially that before would-be academics pursue an academic career, they perform a sort of rational analysis of whether it in fact makes sense to follow the path of academia. The main factors in this analysis are the allure of tenure, the probability of actually getting it, and how much one enjoys one's scholarly interests. All of this is weighed up and compared to other potential paths, and the choice to pursue a PhD means that one has opted for training in the ancient ways of the scholarly arts. This is why Mickey and Yoel engage in long discussions of whether tenure is really that great, and what the main predictors of success are for job applicants; they assume it's a crucial factor in the decision-matrix.


I think they're wrong.


Instead, all of those points are mostly post-hoc rationalization. Essentially: we graduate students -- and I do include myself in all of this analysis -- find ourselves in graduate school, someone asks us why we're there, and we give them a story about the promised land of tenure and how great it is to be able to set one's intellectual agenda. The main problem here is that the decisions we're discussing -- whether it makes sense to attend grad school given the low probability of getting a job -- are not the decisions that people are actually making. And I think our discussion would change in important ways if we were more realistic about what's actually going on.


Part of the motivation for this argument comes from the principles of behavioral science. Humans are capable of employing the tools of rationality, but they're usually not the first ones we reach for. We often make decisions, then come up with good reasons for them later on. This is a classic finding of social psychology. Two papers I really like along these lines are Nisbett and Wilson's (1977) "Telling More than We Can Know" and Fiery Cushman's (2020) "Rationalization is Rational."


But the more important part of the motivation for this argument is that in the vast, vast majority of cases there's simply no possible way to do a rational analysis of what one should do with one's career and have the dial land on "academia." Undoubtedly there are some people who are truly cut out for an academic career as it currently exists -- whose natural dispositions are perfectly in line with what an academic job requires. And it's also no doubt the case that there are many of us who would legitimately make good professors: we would mentor our students well, teach engaging courses, and do interesting research. But there's no way that everyone who does a PhD, or even a majority, does so because it is the One True Path on which they can do what they enjoy and have it make a positive impact in the world.


Half of the equation here is about the alternatives to academia. And it's a basic fact that in other kinds of jobs you are generally afforded the opportunity to: make more money, choose where you want to live, reserve parts of your week as time where you don't have to be concerned about work, and enjoy other benefits that are major contributors to one's well-being. I feel comfortable making these assertions without data, because these are things that academia simply doesn't offer. Other jobs do.


That academia doesn't offer these things is crucial. I worked for two years in the Harvard psych department as a lab manager, and one of the things I was taken aback to discover was just how miserable the untenured professors seemed. I have in mind a handful of excellent human beings, who seemed profoundly more unhappy than they needed to be. Tenure was being held over their heads as a sort of devil's treat, always requiring just a little bit more than what one was currently giving. It seemed to provoke a sense of all-round inadequacy: that one ought to be doing more for one's family, one's students, one's grants, one's responsibilities to the profession, one's research. I point this out not to suggest anyone was in fact inadequate, but because these people are actually the most adequate that it is possible to be. A tenure-track job at Harvard: this is literally the best-case scenario for a would-be academic. If you offered their position to any grad student or postdoc, they would probably be thrilled to take it. And yet it didn't seem like the liking of that prize was commensurate with the wanting.


This is one small anecdotal observation that's part of a larger picture. The probability of getting a job in academia, which is by definition the minimum requirement of an academic career track, is insanely low. The chances are that if you are currently setting out toward getting a PhD, no matter how brilliant and talented you are, you are most likely not going to secure an academic job. And that's just a job, let alone the job you actually want -- in a place where you'd like to live, in the kind of department you feel best suited for, at the kind of institution you might hope to be at. One of the major points of discourse for Mickey and Yoel is to determine just how bad the picture is. And that's certainly a scene worth sketching out. But, in an important way, it also misses the point. Demand for PhDs in academic appointments is going down, yet the number of PhDs minted every year continues to rise. People aren't paying attention to this part of the incentive structure. Why?


One reason could be that people just don't know what they're getting themselves into. Perhaps back in the day potential academics weren't apprised of the dire job market before embarking on their PhD. But I just don't think that's true today. For one thing, everyone talks about it. How many times have you heard this leveraged as a common point of commiseration in grad student conversations over a round of pints? I'd wager the N is large enough to be significant. The other thing is that it has been a problem for so long (about fifty years). The cat's gotta be out of the bag.


The current academic job crisis is a product of what is known in the relevant literature as the "Golden Age" of American higher education [1]. This Golden Age ran for the three decades after World War II: roughly, mid-1940s through mid-1970s. It was defined by explosive, unrepeatable growth in the number of people attending university. The kick-off of this Age was the G.I. Bill, signed by President Roosevelt in 1944, which among other things paid the educational costs of returning veterans attending college. It doubled the number of students at university, and universities needed to hire more faculty to teach them.


Another activity that vets engaged in upon their return, besides educational attainment, was having kids. This was the Baby Boom, which hit universities in the 1960s. These decades were also a time of massive and widespread economic prosperity in America. It was also generally believed that American institutions of higher learning were good places to further the ideals of democracy (both through ideological means, as well as for developing technologies to blow up Communists). The overall point is that this Golden Age is when the rules of the contemporary publish-or-perish academic marketplace were set. The university system adapted to the needs of that era. It never quite adjusted for the period after. Would-be academic faculty have been dealing with the fallout for the fifty years since; we aren't the first generation to face this problem.


One argument that could be made is that, even knowing all this, once you get tenure it is worth it. To me, this argument is plainly insane. For one thing, is that really how you think human behavior works? You spend the first forty years of your life working in one particular mode, and then you get tenure and suddenly adopt an entirely different mode of operation? No way. You're setting crucial habits and life-precedents in those early years, and regardless of your job status later on you're going to be dealing with the consequences of those precedents. I'm sure tenure is swell. But I'm also sure that tenure doesn't wipe your slate clean, liberating you from every decision you've ever made. No, it earns you more freedom within the framework you've already set for yourself. How much you like that framework to begin with is approximately how much you're going to like that framework once it's set in stone. (Exhibit A, Mickey Inzlicht: is happy with his job post-tenure, was also pretty happy with his job pre-tenure.)


But let's say you could architect an immediate paradigm shift in your habits. Would it be worth it to slavishly sacrifice your twenties and thirties just to have a certain level of job security? It's really hard to figure out how the answer is 'yes.' Let's face it: You're doing a PhD. You're smart. You've got skills. While job security can be an issue in society in general, it tends not to be for people who have your level of smarts and talents. Someone is going to want you to do something for them. It'd be great to have a lifetime guarantee of what exactly that is. But it's a nice-to-have, not a need-to-have. It just doesn't seem like job security is what's going to be your main stumbling block in life. For example, in my interview with Don Norman, who has spent time in both academia and industry, he went off about how in tech you actually make more money by getting fired, because you get a new position with a higher salary. Take that claim as you will. The point is: why on earth would job security be the thing you covet most? It shouldn't be your main concern. What should be? Living a life that best uses your gifts and allows you to invest in the things you think are important.

Let's just look at one more aspect of the job security claim. Sure, if you get tenure, that gives you maximum, 100% job security. But at literally every step along the way, your job security is among the lowest that a person of your skills can conceivably have! You spend five years getting paid 1.5x minimum wage for your PhD, then beg for a postdoc that buys you another two years at 2x minimum wage. Then comes the cliff, and you sweat it out for two or three years trying to find that first academic appointment. That is the opposite of job security. What you have is guaranteed job insecurity.


To give another edge case, take Steven Pinker. In my interview with him from earlier this year, one of the major themes of that conversation was how insecure he was about his job prospects early in his career. This is one of the most prominent, most influential people in the behavioral sciences, and he still went through a period of feeling the same thing any of us trying to make it in academia currently feel. Now, it's a separate empirical question as to whether his job-hunt experience actually was, in fact, as insecure as that of someone today. It's probably unnecessary to point out how things have gotten harder since then, or any privilege that Pinker had going for him. The critical thing here is that he still felt the same sense of insecurity. This again works as a kind of upper bound, the best-case scenario. Here is the person who is most secure later in his career; early in his career, he was still insecure.

But, you may say, if we go into a real-world job, won't we have to do things we don't want to? I think a potentially useful way to answer this question is to spend a week or two keeping close track of how you actually spend your time. What percentage of your time is spent doing dull tasks which fail to spark joy? Do you suppose that that number goes up or down if you take a day job? Do you think that number outweighs all of the other attributes that job will bring: allowing you to leave your work at work, increased pay, more time with family? I don't know the actual numbers. But they would have to be pretty spectacular in order to get a favorable answer to those questions.

So it's hard for me to see how any of us got here by asking ourselves about the optimal course of action. Alright then. How did we get here?


I believe the reason why most of us -- and I include myself in this number -- pursue a PhD is lack of imagination. This lack of imagination isn't specific to people doing PhDs. It is a facet of humans more generally. Once we are set along a specific track, it's remarkably difficult for us to imagine an alternative world. What academia offers, then, is not an especially alluring track, but a well-defined one.


It's common for bright young students to get into research during undergrad. I remember asking myself as an undergrad at UCLA, "What is the thing I can get out of my time at this university that I cannot get anywhere else?" One of my main answers was to get involved in world-class research. But you don't even have to be good at research per se. All you have to do is be really good at taking tests (which, by the way, has literally nothing to do with the skills you'll be assessed on later in your academic career). You finish school without a clear idea of what to do next. What do you do? Easy. More school. With every passing semester it becomes increasingly difficult to imagine oneself on an alternate track as a lawyer or an engineer or an interior designer or a baker. Again, this is not specific to people doing a PhD. But people doing PhDs are especially prone to it, because academia is one of the few modern vocations that offers a clear-cut path: undergrad, lab manager/masters, PhD, postdoc, untenured faculty, tenure. There are also clear (if ridiculously competitive) rules for getting from one stage to the next. Keeping the train going on the academic track is, in a paradoxical way, the ultimate minimization of uncertainty. The board is laid out, the rules are set, the goal is clear; you just have to make the right moves. It's like chess. The game might be hard. But at least it keeps you out of the partially-observable Markov decision process.


This brings us back to the prospect of rational analysis. We do perform an analysis of this kind, in a way. It isn't: weigh up all the pros and cons of academia, compare them to the space of possible alternatives. Instead, the path feels right because you know approximately what it will look like. You see the end game. You can visualize success. There's the promise of eminence, of being well-regarded for what you do. There's the promise of doing something you enjoy. There's the minimization of track-uncertainty, the confidence that if you put yourself in a game of chess you're clever enough to win. And, most crucially of all, there are insufficient data about what exists out there in the rest of the world. You attend college alongside other people who are attending college, taught by people who never really left. How are you supposed to know what else there is for people to do? The only wonder is that anyone ever feels they ought to leave.


The good news is that if this theory is right in some meaningful way, it suggests an optimal solution. It is this: by all means, do your PhD. But absolutely enjoy the hell out of it.


Pursue the PhD for the best parts of it, not the worst. The best are about following your own instinctive curiosity; the worst are about architecting a career as an academic. Treat the damn thing like you would a class in second semester senior year. Take it, but don't worry about the final grade. You're not going to be judged by the panel -- filled with excruciatingly harsh Russian judges as it is -- who tend to look at those things. Follow your nose. Do enough to finish, but don't compare your CV accomplishments to your peers'. Measure your success by how thoroughly you're following your own interests. Open yourself back up to new and exciting things. Find that original sense of wonder that got you into this game in the first place. Most importantly, under no circumstances should you claim that you know what you're going to do next.


The world is much larger than you imagine it to be, and the space of things that you would love doing is mostly filled with things you don't yet know exist. This goes for both specific jobs, as well as kinds of jobs.

Suppose that instead of picking a specific outcome (which you have selected almost entirely arbitrarily) and trying desperately to compete for one of a handful of opportunities to do it, you spent your time doing what you love to do and letting that form your path. Instead of trying to game the system and swim in the shark-infested waters of publish-or-perish, you actually allocate your time according to what brings you the most joy.


If you spent five years of your PhD doing that (assuming you do enough to graduate), you will have had half a decade's worth of experience in the thing that you most enjoy doing, presumably building up a set of skills and a nice little portfolio. Regardless of what this is, how would it not make you an attractive candidate for someone somewhere who is looking for someone to do that thing, even if you don't yet know who they are? If that thing turns out to be publishing papers in your field, great. But if it happens to be data analysis, or writing, or clear presentation of ideas, or even something not ostensibly academic, like building relationships or setting up technological infrastructure, then there's also going to be a place for you. And guess what, it's going to be a place specially designed to maximize the thing you most enjoy doing -- which, not by coincidence, is also probably going to be the thing you're best at doing. This is a bottom-up strategy for making a career, rather than a top-down one, and whether you realize it or not it's the only one anyone can actually follow if she expects to find some magic along the way. (I like the way Linda B. Smith puts it; she calls it "making the best local decision.") There is so much more that's fantastic and interesting and enjoyable and profitable in the world than you can currently imagine. Instead of succumbing to that ignorance, I think we should leverage it.


So am I actually "against academia?" Not really. I have a couple years left in my program, and I'm going to try my damnedest to kick some ass in my research. If I succeed in doing that, I'll try for a tenure-track job. But I'm not married to the academic track. It's not how I define myself. And though it's hard, I try to remind myself that the goal of all this is not to be good at my PhD. It is to be good at my life. There are many ways to do that.


---


[1] For a really interesting analysis of trends in American higher education, see Louis Menand's (2010) The Marketplace of Ideas. My claims on post-war higher ed in America are drawn from Chapter 2.

Howard Gardner is the John H. and Elisabeth A. Hobbs Research Professor of Cognition and Education at the Harvard Graduate School of Education. He's best known as the developer of the theory of multiple intelligences, the idea that being smart is more complex than just an IQ score. That theory was introduced in his 1983 book Frames of Mind. In this conversation we talk about his interdisciplinary education, his non-traditional path in academia, the principles of the "synthesizing mind," when to break the rules and when to follow them, and how to connect with one's deeper humanity. Howard's latest book, A Synthesizing Mind, is out now.


Inspired by Love. Guided by Knowledge.


"The good life is one inspired by love and guided by knowledge."

-Bertrand Russell