« More From GDC | Main | Serious Games Association Launches »

Mar 27, 2012



Sadly, I find very little to disagree with you on here. You've left qualitative data collection and analysis out of the picture as something that can have real-world (read: commodifiable) value, and I take some issue with that. Admittedly, there are more job openings (both in and outside the academy) for people with strong quantitative skills, but from my perspective that's largely a result of the current cultural fixation on quantification rather than the potential practical value of the different forms of data. I also think that the "creating something" category isn't really all that safe in the case of the more artistic endeavors (symphony and novel writing don't usually come with healthcare plans), but that's just me splitting hairs.

In general I have to agree that the picture for higher education is pretty bleak. I consider this to be sad not because I don't think there's room and even need for change in higher education, but because I believe that the changes we're seeing (and will continue to see) are driven by very narrow commercial interests that are also culturally stultifying in a variety of ways.


I would also tend to agree with the general thrust of this post, although I would add the caveat that this is, as mentioned at the start, an American context. While the bubble is less likely to "burst" in countries that support universities through government subsidy and grants as a form of public education (I'm Canadian, so I tend to believe that we have emerged from the global market turmoil in slightly better shape than most), there will still be consequences for those professors and faculties that don't relate their work to something tangible in the "real world." Our university has "suddenly" become interested in professional practice programs...

The last conversation I had with an executive about these issues supported your point about students and funding. I asked which of three areas gave the university the most economic traction: research grants, government funding, or tuition. The response, in order: tuition (the cash-cow courses, Econ 101 and Psych 101 especially, and so on), government funding (about 35% of the total), and, a distant third, research and corporate grants or donations.

Likewise, during a 30-year visioning exercise (i.e., "based on an assessment of the dominant factors of the present day, what will X University look like between 2030 and 2040?"), the "most viable" response (at least as seen by the President when the dust settled) was a low-faculty/high-student/content-on-demand/mobile-delivered/build-your-credential-from-many-sources model.


"Mere expertise won't keep you alive. Instead, figure out how to do something that the world needs, and flourish!"

Yes, and most likely, flourish in private industry where most of the really exciting opportunities are for research these days.

I would encourage those with dispositions toward career academia to first step out into the scary wide world, learn how to do something very well (or preferably, many somethings), and then, one day--once they've exhausted their learning potential and achievements in industry--to only then consider coming back and teaching the next generation how to do it.

No offense to present company--I say this as someone with quite a few tenure-track family and friends, and nearly all of them ended up there foremost because they had no prospects anywhere else--but the best professors in nearly any field are those who've been there and back again, so to speak. ('Best' defined as what most benefits the students, not just what allows an academic to not get fired.)

On the other hand, kudos to you, Ed, for stating the blindingly obvious when so many of your cohorts still can't face it.


I remember at the first small conference I went to at UCLA years ago, (I think it was the one where I first met you, Ted), there was a fellow who said something along the lines of "All your jobs are gone in the future" to the professors present.

His point, and I think his nascent commercial enterprise, was that the very best lecturers would be recorded and available online to everyone in the world. That ordinary teachers at college level would be replaced by these experts and the everyday grading stuff would be automated and outsourced.

It seems as though something very precious would be lost if the inspirational personal mentoring that can happen with an engaged student is lost but then, hey, look how many people meet and marry through online dating. That would have sounded equally sterile and unpleasant years ago but somehow seems to have become the norm.


I'm worried about qualitative empirical approaches. I understand that in theory they provide important insights. However, two practical matters persuade me to advise students *not* to rely on qualitative research as a breadwinning strategy. First, far too many qualitative reports are only that - reports. At worst, they are politically motivated reports, hatchet jobs in the service of the radical left. At best, they are diary entries. Because of this general trend, I can say with some confidence that nobody outside the academy lends much credence to qualitative first-person empirical research. And unless something dramatic changes in the fields where qualitative empirical work is taught, I don't see much hope of improvement. Outside the academy, the papers of the qualitative single-person empiricist school are too often given as evidence of the corruption of professorial practice.

You are free to believe that World of Warcraft represents post-colonial dreams of Amero-Imperialist oppression, but few sensible people out there agree that the "research" leading to that belief makes a person worthy of a tenured professorship at their expense.


I get your point, but I don't think I understand the bubble reference. Are you predicting a bubble in the sense that the investment being made by the current generation of students (i.e. in their education) is going to suddenly burst and send the value of an education into a freefall? I don't see that. I also don't see that there will be a sudden reduction in the value of the government's investment in education, since it is currently subsidizing both private and public institutions. So what is it that's going to explode unexpectedly?


I see now that I'm contradicting myself - talking of bubbles at the start but of mere contraction later.

What makes things bubblicious, to me, is the coordination game nature of higher ed. In the education-as-signaling model, a degree's power as a signal depends on the social assumption that institution X, rather than institution Y, is the screening institution. But there's nothing special about institution X other than its role as the screening institution. If the social assumption changes, then the location of social investment changes.

In simple terms, everyone knows that universities are the place where you show the world your awesome chops. Therefore everyone who has chops to show goes to universities. And therefore you can guess whether some new person has decent chops by the fact that he attended a university. But this can all change in a heartbeat. If instead the world believes that Phoenix University Online is where every smart person goes, then it behooves every smart person to go there, and it behooves everyone trying to guess who is smart to lend most credence to the Phoenix University Online degree.
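The coordination-game logic above can be sketched as a toy best-response model. Everything here (the `signal_value` payoff, the `update` dynamic) is an illustrative assumption for exposition, not anything from the thread: the point is only that once everyone expects talent to gather at one institution, best responses lock that expectation in, and a hair's-width shift in belief flips the whole equilibrium.

```python
# Toy best-response dynamics for the education-as-signaling coordination game.
# Each person attends the institution where they believe the rest of the
# talent goes, because that is where a degree signals the most.

def signal_value(share):
    """A degree's signaling value rises with the share of talent attending
    that institution (a stand-in assumption for the signaling story)."""
    return share

def update(belief, rounds=50):
    """Iterate best responses: everyone attends X iff they believe X carries
    the stronger signal; beliefs then track actual attendance."""
    share_x = belief
    for _ in range(rounds):
        # Best response: go to X when its expected signal value is higher.
        share_x = 1.0 if signal_value(share_x) > signal_value(1 - share_x) else 0.0
    return share_x

# Two initial beliefs, barely apart, settle on opposite equilibria:
print(update(0.51))  # belief that X is the screening institution -> everyone goes to X (1.0)
print(update(0.49))  # belief tips the other way -> X empties out (0.0)
```

Nothing about institution X's quality appears anywhere in the model, which is the commenter's point: the equilibrium rests entirely on shared belief, so it can relocate as fast as the belief does.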

These changes in collective belief can happen very quickly, in the course of a decade or even a year. In 1988, everyone in East Germany assumed that there would always be an East Germany. But East Germany disappeared in 1990.

As for government support: I think my colleagues underestimate the disappointment felt toward universities and their professors by even the most moderate of well-meaning conservatives. I'm thinking here of the reasonable businessman. I'm not at all sure he's willing to keep funding the activities that currently go under the name "university." I would not be surprised at all to see a decade or more of radical reduction in state support for traditional universities in favor of non-traditional (often online) certification systems.


I think you over interpreted me a little bit on my qualitative research comment. I simply meant that skills arising from ethnographic traditions like taking effective field notes and conducting interviews (along with the analytic practices that accompany them) can have value outside of pure research contexts too. They certainly play a major role in user experience research after all, and in fact they are most valuable in that setting when they are coupled with quantitative approaches so as to be able to answer each question with the appropriate methods. As for political applications for such tools, that's a whole different conversation. Any research tool can be (and generally will be) subverted in the context of politics.


Moses, let me respond to your last comment, to the effect that research can and generally will be politicized. I know this is now commonly assumed across many disciplines, but it is a major stumbling block for those on the outside to whom we appeal for support. People outside the academy actually believe that it is possible to perform third-party analysis that is not politically distorted. Indeed they insist that it be so. And so they have deep concerns with the approaches of the fields that rely so heavily on Foucault and friends. The now-casual understanding among academics that all claims to objectivity are invalid is perhaps more important than any other factor in explaining why legislators and parents and employers are ready to look elsewhere for their training and certification needs. It's why they demand that we focus only on vocational studies, because we have ourselves said that, in the liberal arts and humanities and culture, we can offer them nothing but politics. We long ago ceased asserting that people who attend universities grow in their knowledge of important generalities; rather, we destroy what generalities they believe in at the start and attempt to persuade them that there can be no generalities worthy of support and belief. Understandably, they leave the university confused and empty and, eventually, ask themselves if they learned anything at all there. To which the answer is: no, unless they picked up a trade skill.

Anyone who wants to survive this contraction should avoid truth-denying fields at all costs. The teaching is basically false and the world is rejecting it out of hand.


It was elementary school, rather than college, in which I grew in my knowledge of generalities. College is where I learned to critically consider arguments and identify their underlying assumptions, and where I specialized in a particular perspective. The thing that got rammed down my throat is that businesses generally don't want automatons with a trade skill. They want people who can look at a problem from different angles, innovate new approaches, learn new methods, and generate a value that's at least as great as the wage they are paid. To wit, the supposed goal of a liberal arts education is to create careful, flexible thinkers/workers that can meet these expectations.

I'm not sure that a class in Symbolic Anthropology or Queer Theory or Global Issues in Literature doesn't help accomplish this goal. It certainly exposes you to a very different perspective and requires you to decide whether the content is intrinsically or practically valuable. If students don't express skepticism when they experience it, then they have failed in their jobs as learners. If professors don't challenge their students, then they have failed as teachers. And if people want to major or get a PhD in fields further afield, then let them bear the economic consequences.

And politics has always been at the center of higher learning. Always. I think it's good that we're now aware of that fact. Ironically, the public's "realization" of this truth is related to the fact that the public is better educated and thinks more critically than it ever has before. The single most valuable contribution of the revolution in the academy in the 1950s and 1960s was the demonstration that how we frame questions, generate tests, and interpret results all depends crucially on who we are and the stories we tell. Social scientists of all stripes do better science as a direct consequence of realizing and coping with the understanding that man is truly a political animal.


Noble and agreeable sentiments all. But do you think that the contemporary humanities are engaged in a project of merely enhancing everyone's critical thinking? There's a difference in practical terms between telling people to be mindful of their own biases and telling them there are no grand narratives, no progress, no universal truths, no sharable understandings. It's interesting, Isaac, that you single out the social sciences as having largely gotten this right. I would put it this way: The natural sciences have always had it more or less right, because of their reliance on replicable research methods. Some of the social sciences have gotten it right, being mindful of personal bias while still striving to present findings that others can rely on. This project does, as you say, develop important critical judgment skills. But some of the social sciences, and all of the humanities, have gotten it wrong, dismissing all data as politically driven and encouraging rather than striving against the tendency of academics to pursue personal and ideological agendas in their work.

I don't necessarily even have an issue with this, as a method, so long as everyone understands what they are getting - an intentionally biased view. Attorneys do it all the time. But in the law, there's always an opponent presenting an intentionally biased opposite view. That's missing in the academy. All these critical thinkers and mindful skeptics seem somehow to always come down on the radical left view of things. Funny how that happens. You don't suppose there's some systematic practical bias at work? That is, you don't suppose that our radical left academic friends are turning personal biases into systematic discrimination in hiring and promotion?? Jonathan Haidt once asked an audience of hundreds of social psychologists how many were libertarian (20) and how many conservative (8). He noted that such numbers, in other contexts, raise howls of protest. A few years ago I saw a UCSD Comm job announcement that bluntly required the candidate to have a forceful, articulated ideological agenda for their work. I wondered - wouldn't it be fun to apply for the position on the basis of an agenda advocating widespread adherence to the positions of Benedict XVI?

One explanation for the absence of conservative thinking in the humanities and the qualitative social sciences (the lost sciences) is of course that conservative thinking is just plain dumb. Can't rule that one out, just looking at myself. The other is that this business of anti-hegemony and pomo professorizing has acquired a hegemonic, thoroughly modernist/Stalinist hold on important arenas of intellectual discourse.

Now, that may all be well. Perhaps we're getting cadre after cadre of impressive critical thinkers that the outside world craves for their acumen and edge. I am not sure how that could happen, given that an English or anthro major at almost any mainline school is exposed exclusively to one and only one political view throughout their entire education. But let's just say.

If that were the case, it seems to me, smart people on the outside - such as those in Silicon Valley - would be trumpeting the importance of liberal arts education. They'd be urging young people to ignore vocational areas and rather focus on literature, cultural studies, and social theory. Instead, the smart people out there, the forward-looking ones, generally agree that the academy (indeed the entire US education system) is broken, perhaps beyond repair.

If this were not the case, we would not be facing the reckoning that is surely coming. Young academics should indeed be mindful of their biases. They should indeed be critical thinkers. And if they are, they will realize that the political views of the current professoriate are woefully out of touch with the vast majority of reasonable people out there; that they are not by any means being exposed to a wide variety of thoughtful opinions; and finally that if they want to do well in their careers, they would do best to simply learn how to present credible research findings that are useful to others.


From the contradicting yourself department...This morning I wrote the previous rant about the humanities. Then I had lunch with two professors, one from English and another from History, whose opinions and attitudes completely invalidated just about every complaint I expressed here about the humanities. Good people. So, I don't know what to think, but certainly, I guess it's not as bad as I usually think.


Then again, I just read in the New Criterion this gem from Gayatri Chakravorty Spivak, University Professor at Columbia University.

Begin quote
I have said that the “singular,” as it combats the universal-particular binary opposition, is not an individual, a person, an agent; multiplicity is not multitude. If, however, we are thinking of potential agents, when s/he is not publicly empowered to put aside difference and self-synecdochize to form collectivity, the group will take difference itself as its synecdochic element. Difference slides into “culture,” often indistinguishable from “religion.” And then the institution that provides agency is reproductive heteronormativity (RHN). It is the broadest and oldest global institution. You see now why just writing about women does not solve the problem of the gendered subaltern . . .
End quote


And to tie my previous comment to Edward's thread:

Question #1: Which Indian woman does more to advance the ostensible cause of postcolonial feminism: the one dedicated to doing so, writing obscure screeds in solipsistic cant...or the one busy being CEO of Pepsi?

Question #2: Which is more likely to be hired by a major university to teach undergraduates Important Things About the World?

Question #3 (bonus points): Why aren't the answers to #1 and #2 the same?


If I had to do it all again, I'd just build/make stuff.


There's a demand side and a supply side in the market for professors. So, @Sniffy, the correct answer to 2 is "Clearly not someone who can make millions of dollars as a CEO of Pepsi, since such a person would not apply for a university professorship". Whether one makes a greater or lesser contribution to feminism is kind of a non sequitur. The question is, who's the better teacher and researcher given market prices? Again, probably not a person who can be the CEO of Pepsi.


@Ted: There is plenty of room to come down on either side of the "do social scientists achieve a reasonable level of objectivity?" question. We could parse that until we're blue in the face (and let's do, at some point). I personally think there are bad apples but most practitioners do try to be objective. But I am skeptical that the humanities - in which I include philosophy, art, literature, foreign languages, and maybe even history - should be held to the same standards and expectations as social science, as you seem to imply. I do not expect my epistemology or my interpretations of Dvořák's 9th symphony to be data-driven or objective. That's not really the role of the humanities.

I'm not sure what part of the intellectual discourse you think conservatives are necessarily excluded from. Could you elaborate? It just seems to me that having the "right answer" in the humanities is historically impossible. For example, the entire set of courses of undergraduate philosophy is understanding the problems with Plato, resolved by Aristotle; the problems with Aristotle, resolved by Descartes; the problems with Hume, resolved by Kant; and so on. The history of thought and practice in the humanities makes me think that if you believe in a universal truth, chances are there's someone smarter than you who tried to defend that truth, and then someone else who figured out why there's a problem with that defense. If the humanities now reject universal truths, then that would seem to derive from three millennia of successful rejections of claims to universal truth. If conservatism is defined by some set of beliefs in universal truths (God, liberty, limited government, free markets, value of a human life, the right to eat beef), then I can see how a conservative would feel alienated in a course in post-structural philosophy. That's correlation, however; not causation. (I'm not going to even try to explain away the UCLA job announcement, though.)


I agree that the demands we place on the humanities are not the same as those placed on the social sciences. But perhaps I can propose a unified rationale for the differing demands: That the work of all these people be useful to others. In the social sciences, this would mean offering analysis that is on-point with respect to clients and the public, as well as reasonably objective. In the humanities, this would mean offering interpretation and guidance that helps a non-expert with three different tasks: 1) judge where to put precious time, 2) develop critical appreciation, and 3) understand the human condition.

To return to the OP, if the humanities were sustained by such criteria today, I would not advise young people to avoid them. If the social sciences stressed objectivity across the board, I wouldn't be advising young people to be careful about that. So really it comes down to an empirical question. First, is my characterization of the fields accurate? Second, are the things I'm talking about the source of the outer world's general disdain for universities?

So, what's the evidence of unhelpful practice in the humanities? As you know, I would offer the exclusion of alternate political views, such as conservatism, as evidence. And you ask, rightly, where's the evidence of that? You list a genealogy of experts whose discourse itself simply leads to a rejection of a favorite conservative idea, universal truth. So, as you say, a course in post-structural philosophy is not friendly to a conservative point of view. And you're on the verge of saying (I'm not sure whether you intend the implication) that the history of thinking is not friendly to it, which is close to saying, it's just wrong. Thus, why worry that smart people don't teach conservative thought?

It blew my mind, as I grew older in the university, how many quite smart people and ideas were left out of the conversations I heard. I read Johnson's Modern Times and was surprised to discover that I'd never heard anything like it, nor encountered anyone espousing that sort of view. There are these moments I remember. Sitting at lunch with colleagues in political science and economics realizing that nobody, not one person, thought that perhaps, just maybe, Rodney King had it coming. Or the number of times over the years when something really quite nasty and stigmatizing about religious thinking or pro-life politics has been asserted (crazy, fanatic, madmen, zealots) on, and this is the kicker, the breezy assumption that of course everyone in the room agrees. Ha-ha-ha. Or, coming to the end of Ken Binmore's two-volume masterpiece on justice, game theory, and the allocation of goods and realizing that nowhere in the books does he consider the possibility that two people might actually deserve different well-being, that is, that our choices have a moral component that has implications for our deserts. Or the time I asked a close relative, a holder of BS, MS, and PhD degrees from laudable secular institutions, whether she had ever heard any professor say anything remotely reminiscent of some right-wing rant I had just unleashed; no. I have seen hundreds of plays at and around universities; dozens of them had a political point; every one of those political points was left-of-center. Not most - every one. I have never heard an academic say "I am opposed to abortion" in polite company. I have however seen an academic (a social scientist) call a doctor a "dick" and start a fistfight because the doctor had the temerity to say that doctors, not the state, should decide which poor people should receive free care.
Or the moment when I teach the concept of aesthetic judgment and find that my undergraduates have never encountered the possibility that some things in the arts may be better than others, that some voices may actually deserve more of our time, that not all moral arguments are equal. I could go on.

Maybe it's just because I'm persnickety, but gradually the ideological conformity I experienced (socially, you know, over beer and coffee) made me more and more interested in the things all these people were ignoring. This went along with the modernist turn in art, music and literature which, to me anyway, seemed to gradually give over any idea of beauty, compassion and, most important, an orientation to the infinite. Its replacement by post-modernism just exchanged oppressive emptiness for a series of not-funny jokes. Frankly, it all kind of ticked me off. It's one thing to disagree with ideas; it's another to ignore them or laugh at them. Anything being ignored and ridiculed with such universal and constant effort is probably worth a look-see, I thought.

When you look and see, you find a tradition of thought that goes back a very long way, from which Descartes and Kant dissented, quite possibly erroneously. And yes, this is possible, is it not? Is it not possible that 200 years of philosophy has been sadly mistaken? To believe that *because* post-structuralism is active today, that *therefore* it must be better than Aquinas, is to commit one's self both to progressivism in history as well as the economists' bane, panglossianism.

Mucking around in all this, it has struck me that very little being said today is new at all. What academics today take for very settled, very obvious, very true (though not named as such) positions, has been said many times over the past 2,500 years. Nonetheless, the ancient tradition, the old line of thought, persists. The evolutionist in me (as well as the church-goer) thinks this is not an accident.

Think about it. In 500 years, will anyone be a Straussian? Will anyone read Derrida? Now contrast that with Augustine or Maimonides. Think about the sustainability of ideas. Can a consciously biased social science persist? Can a humanities that denies the infinite persist? I don't think so. And this is precisely the moment we are in, it seems to me: The moment when the intellect's dalliance with ideas of nothingness, even though sustained as a community and an institution for a half-century, eventually comes to an end. There's no future that way. Thus the world, ever-busy and filled with people seeking meaning and a quest of some kind, is turning its back.

My advice to the young is simply, tap the world on the shoulder and remind it that some of us, the younger ones, do have things to offer: Guidance, discernment, and hope.


From my Inbox. Good luck!!!

At the end of the month, the Sawyer Seminar on Rupture and Flow will hold a conference on neoliberal logics and institutional engagements with these logics.

The website with all the abstracts of the papers can be found here:

The info on the conference:

Neoliberal Regimes and Institutions of Knowledge Production

April 27-28, 2012
Room 100, Classroom Office Building, 800 East Third St.

Neoliberalism has become a remarkably widespread political and economic perspective, so much so that over the past three decades many institutions have altered their practices to incorporate neoliberal principles. Yet not all institutions are adopting the same neoliberal principles, nor are these institutions all easily or eagerly accommodating neoliberal transformations. By asking how different institutions respond to neoliberalism in institutionally specific ways, we are also following Wendy Brown, David Harvey and Phil Mirowski in taking neoliberalism to be fundamentally distinct from earlier forms of capitalism. This conference will explore the uneven processes of neoliberalization comparatively, focusing on how different institutions respond to neoliberalism. Because neoliberal philosophy and policy places so much emphasis on transforming the ways in which knowledge is owned, produced and circulated, this workshop will focus on institutions that centrally engage with creating, labeling, and circulating knowledge: certification regimes, universities, corporate research parks, courts, and administrative legal regimes. Key questions will include: how have institutional practices surrounding knowledge production, management, and dissemination been reworked in response to neoliberal policies, and what new discourses or institutional logics accompany these changes?


So what's conservative in this discussion seems to be a "belief in the infinite".

You said: "So, as you say, a course in post-structural philosophy is not friendly to a conservative point of view. And you're on the verge of saying (I'm not sure whether you intend the implication) that the history of thinking is not friendly to it, which is close to saying, it's just wrong. Thus, why worry that smart people don't teach conservative thought?"

Hmmm... No, I don't think this is what I'm saying, exactly. If the progress of philosophical thought has moved steadily away from the conclusion that "Something infinite exists and is important to know", conservatives may find this offensive. But if philosophers don't study a perspective that is ostensibly conservative, that is incidental. Or if philosophers seem liberal because they do question the existence of the infinite, that is also incidental. The history of thinking is not friendly to the infinite. I'm saying that. I'm not saying anything about any other aspects of conservative thought, which don't seem to be at issue here.

To address your goals of the humanities (and since we're harping on philosophy, I'll stick with it): 1) We generally recognize that the philosopher is an expert in philosophy, so we trust she will direct us to the primary and secondary sources that allow us to make effective use of the time we want to spend learning about philosophy. Therefore, if the philosopher refers me only to the collected works of Aquinas (or Nietzsche or, god forbid, Foucault) and nothing else, she has failed in this goal. 2) If we are referred to a fair assortment of primary and secondary sources, and we reflect on our reading with the guidance of the philosopher, then we develop a critical appreciation for philosophy. If the philosopher does not question our assumptions or subject our conclusions or the conclusions of our sources to critique, she has failed in this goal. I have to say that I've never met a philosopher who's failed in these two goals.

And then there's 3), the human condition. Which I guess in the context of philosophy means asking questions like "Who are we? Where do we come from? What are our limits? What is beautiful? Are we, to paraphrase Yoda, 'luminous beings, not this crude matter'? And how do we know all these things?" One possible set of answers is {Souls temporarily locked in corporeal form; God; there are none; Love; yes; faith} and that might be a conservative position. But it certainly isn't the only conceivable position; it has no claim to being the objectively better position; and if the philosopher has done 1) and 2) correctly, then she will have forced the student to cast doubt on that position. If the student doesn't like feeling doubt, I have no sympathy. The experience of doubt is a pretty important part of the human condition.

I can't find any reason to believe that the humanities need the infinite to have a future. Maybe in the Nietzschean, or Hegelian, sense: constantly striving, looking for something better, more beautiful, more correct, a better theory, or a more efficient method. I suppose believing there's always room for improvement is a belief in the infinite. But although people may not be reading Levi-Strauss in 500 years, they will surely be reading Kant (as well as Augustine), and they'll be saying "If a Bible-thumping conservative genius like Kant had a hard time demonstrating the existence of the infinite, that's important information."

So doctors are being hit for being conservative, and communications PhDs have to have agendas to get jobs. I recognize that this is a problem; that it means something more than "Drunk people are stupid" and "liberals can be really annoying, aggravating shills". That doesn't mean little Timmy is being indoctrinated in radical Stalinist ideology in philosophy 101, that he's leaving college feeling empty, or that his experience of the humanities hasn't objectively improved his life or created economic value that is worthy of society's investment.


So we end up with good common sense. I take this last essay to be saying, "Look, the fact is, good people are working on important problems and they're just coming down on a particular side of the issue that you (Ted) don't like. But that doesn't mean there's a disaster in the humanities."

That's fair. I guess we will just have to wait and see.


The ramifications of this aren't just limited to academics as a career. The bubble has an effect on the students as well. As we churn out university degrees we diminish their value, much as the value of the high school diploma has been diminished.

A bachelor's degree is the new diploma, required for even menial secretarial jobs, or a position as manager of your local Starbucks. The master's is the new bachelor's degree, where salaries start showing some improvement, though this is being diluted with master's degrees in candy fields like business administration.

From a teaching standpoint, the bubble contracting will cost teachers their jobs. From a scholastic standpoint, the bubble continues to grow, fueled by the internet. Degrees are easier and easier to get, so what was once a mark of success (a degree of success) is now simply a career necessity to illustrate how much bullshit you are capable of taking as an individual.

Unless of course you study a discipline based in mathematics or the physical sciences. Numbers and matter, it seems, are the only things that are incorruptible.


Check out this datum. According to the latest salary survey in the game industry, the average salary among "Some College" in all fields is higher than "Bachelor's Degree." For example, for programmers, College Dropout = $105,000, College Grad = $89,000.

The only educational level that exceeds College Dropout is "Some Graduate," that is, Grad School Dropout.

Bubble, bubble, bubble...


The humanities can be next to useless if the goal is "truth" rather than a historical study of what people valued, and of fashions and trends.

I very much enjoyed my art history classes, especially coupled with my ancient literature classes, my classes in philosophy, my classes in historical economics, cultural anthropology, cultural geography, etc.

There is truth in the interrelations of things, and the critical thinking involved in extracting non-mathematical, non-empirical truths in terms of correlations is really something special.

To say why might be to debate the meaning of life I suppose, so I'll leave it there.

I might add that economists' great failures of late have been in trying to quantify human behaviors such as non-financial motivations (social kudos, group behavior, "give the man a hammer" issues related to economic regulation and academic oversight, and semi-mathematical things like the agency dilemma, mania, etc.). Economics would do well to use subjective "heuristic" tools to create limits based on history, without having to quantify the "whys" behind the heuristics, especially in regard to credit and banking regulation.

How many loans to developing countries, Savings and Loan scandals, dot-com and housing bubbles must we have before economists realize that those working for institutions on a salary/commission/venture-capital-fee basis cannot be trusted to have sufficient motivation from the laws of supply and demand to act as stewards maximizing the efficiency of the economy, while they still have the power to quash competitors who would expose their failures before the inefficiencies became so costly?

Reading Pericles's speech in Thucydides really helps bring home some age-old truths, and a person with cultural anthropological exposure will assume that human dynamics will largely not evolve that quickly over a short period of a few millennia.


Back to the subject:

The health of higher education doesn't mean there must be more people employed as academics.

I'd say the health of academics relates to the number of people educated, much as the "health of the agricultural industry" relates more to how much crop production you have than to the number of workers employed on farms.

I sort of resent that I can't find (or haven't found) a way to audit Boalt Law School classes at the University of California, Berkeley online. Why should there be a limit to how many people can at least watch those lectures at public universities? I understand that auditing is different than being called on to orally recite and interpret cases, or being held to the rigor of graded writing. Yet if the charge of the university is to spread knowledge, it is failing if it unnecessarily creates barriers and scarcity where none are needed.

With people living so much longer, I'd think that many academic careers could be started in people's 60s and still last a 20-year period with sharp minds.

How would economics and humanities change if all the professors had lived a "secular" life that they could draw upon before entering full time academics?
