Why isn’t ethnography.com more focused on ethnography? Um, ‘cause I don’t feel like it.

I like to use the categories on our homepage to surf through old posts, looking for oldies but goodies to re-post on slow days. I also like to read and think about anthropology and sociology and I can count on finding something here to get my mental juices flowing. And like Mark describes below, I like to think about social science in terms of strategy and innovation. I think that if you want to make it as an anthropologist or sociologist outside of academia, you have to adopt a “broader and more holistic approach” to ethnographic work. A couple of years ago I read an article in The Atlantic titled, “Anthropology Inc.” and it changed the way I thought about doing social science. Click the highlighted link in the previous sentence but make sure you read what Mark has to say below.

Originally published by our founder Mark Dawson in July 2007.

A friend asked me how many people regularly read this blog. Well, not a lot. There is a good reason for this. I have owned the domain ethnography.com for about a decade, as well as several other anthropology-related domains. On the other hand, while I am an ethnographer, my professional life is focused on strategy and innovation, of which ethnography is just one of the tools in my toolbox. This blog is not unlike having a big sign outside your store that says “Motorcycle Repair” and wondering why no one is popping in to order a pizza.

If you are looking for information about Kula rings, Margaret Mead, Structuralism and the Yanomamo, let me please point you to Wikipedia.com. For basic social science information, it’s pretty good. If you want to learn how to make a living as an anthropologist, then this is the blog for you!

See, all of these entries are about culture in some way. What draws companies to bring anthropologists into the fold is the belief that anthropologists take a broader and more holistic approach to understanding both customers and themselves.

So this blog is about strategy, innovation and people who say interesting things about those topics from an anthropologist’s point of view. -M.D.


Why I Chose Not to Get a PhD

This was originally published here on ethnography.com in April, 2012. Why did you choose to get a PhD (or not)?

I got to spend some time with a friend recently who decided some time ago to restart her PhD work. She is already ABD, but is starting the dissertation over from scratch. My question was “Why?” She is a well-respected professional, and within her field a PhD will likely be of limited benefit professionally compared to the mountain of work ahead of her, not to mention the expense involved.

In the course of the conversation I was reflecting on my own choice not to get a PhD, and thinking that it might provide food for thought for a larger audience, not to mention a look at the pitfalls of getting too attached to getting one.

When I started my graduate work in anthropology, I had the same expectations as most people: I thought I would wind up teaching at a university or maybe in some kind of think tank. Rather than going directly into a PhD program (I already had a B.A. and M.S.Ed. in other areas), I chose what was then a terminal master’s program at the University of South Carolina. I thought at the time that doing an MA first would enable me to get into a better PhD program; in reality, I don’t think it makes a difference either way. Future graduate students should also take note: USC now has a full-fledged PhD program that started several years after I finished my M.A.

Well, as time went by, my interests and goals evolved. Not an unexpected thing to happen as you spend a couple of years learning about the ins and outs of a discipline. Looking back, I believe that one of the most significant course changes was when I decided that I was more interested in applied work rather than working in academia. I won’t mince words: once it got around the department that I was not planning on pursuing an academic path, it felt like I was pretty much dropped like a hot rock as far as most of the professors were concerned. One professor [to remain nameless] didn’t mince words either; she told me flatly that any student who was not planning on a professional academic career as an anthropologist should not expect any interest on the part of the instructors. My thesis advisers promptly dropped any interest in my thesis work as well, and it shows.

Before you think it was awful, I am talking about significant small moments in time that occurred during my grad work, not the entire school experience. I got an excellent education, I had some great instructors and I would go back to South Carolina again. At that time, quite simply, applied anthropology was looked down on, as was stopping at an MA. Things have become considerably more enlightened in the discipline overall since then.

Compounding the issue of being primarily interested in applied work, my research interests in two divergent areas were not seen as worthy of anthropology. One was intentional violence. My graduate thesis was based on intensive research with a prison population, and that evolved into interest in two areas: terrorism on the one hand and serial homicide on the other. I was curious whether both could be studied almost as a cultural language, or through the semiology of the acts. The second was in a totally different area: due to my long-standing technology interests (I had always put myself through school as a computer jock), I was becoming much more interested in the intersection of culture and technology. It turns out that the latter interest would serve me very well later in ways I never imagined.

But given all that, I STILL wanted that PhD. Why? Well, as it has for so many others, it became for me the difference between success and failure. I was $150,000 in debt and looking at more; I had years of education behind me and more to go. To me, getting those three little letters was the difference between being a legitimate scholarly person and a nobody. I got so nutty about it that I wouldn’t even date someone who was not getting some advanced degree (that stupid arrogance likely cost me some excellent relationships). A PhD was a ticket to studying the topics I wanted and to a life of scholarship, and (the applied part) once I got the ticket, I would be able to pursue applied endeavors at will. Yes, I was indeed blind to how the life of a university professor really looks.

So what happened? Shatteringly, but in reality luckily for me in the long run, I did not get into my first two choices for a PhD program, but was accepted to the applied PhD program at the University of South Florida. Given that my interests were then more fringe topics, there was no one there doing work even remotely related, and I was concerned I would suffer from a real lack of mentorship. Also, the connections you make in your PhD program can be very important when job hunting, so not having dissertation advisers who could make introductions later was a concern.

Then came the proverbial last straws. I went to an AAA meeting, and on the job board were four or five lonely-looking position announcements for very low-paying positions (as they usually are), seeking scholars of a few countries in Africa. The next factor was watching from a distance as the USC anthropology department fielded applications for a new position. There were not dozens of applications; there were hundreds, from people with long publishing histories, all from the top-tier programs at the time.

I realized quickly after that I could not justify continuing on with more graduate school. The math was fairly stark: endure an additional crushing debt load for the fairly small chance that I might get the job I wanted, at a salary that would barely cover my debt, rent and food, in an environment that I really didn’t like all that much.

Understand, I was never much for the publish-or-perish game, or the nasty politics that can emerge in academic departments, so I was ill suited to the profession anyway. But that is not the reality I was thinking about at the time. I remember the moment I knew I was going to quit pursuing the quest for a PhD. It was devastating. I called up a friend of mine who had made the same choice after going ABD and bawled my eyes out. “It has all been a complete waste,” I told her. “All the years, all the work, all the money has all been flushed down a toilet and I have nothing to show for it.” I don’t remember what she said, to be honest. I am sure it was supportive and reassuring, and that none of what I was thinking was true.

I can tell you this much: all of the thoughts I had about not getting my PhD equaling failure were and are utter bullshit. Why do I say that? Here is what happened once my head cleared, I got the emotional cobwebs out and started to assess what I wanted to do.

I wanted to keep studying culture, I wanted to be involved in technology and I wanted to get my hands dirty using anthropology to actually do something. First I got a job working full-time at the university as a computer jock, and I started by regaining my life: I got involved in the local old-time and Irish music scene in the area, I made friends who had nothing to do with anthropology, I worked with a friend leading canoe trips on the local river, started rock climbing, and generally had a pretty happy life.

And I also did research, lots of research, into the life I wanted. I scanned journals, periodicals and professional trade publications looking for any connections between people working in anthropology or social science and technical fields. Design Anthropology was in its infancy then, and I was lucky enough to find an article about some anthropologists combining anthropology and technology skills to help companies develop new products. Then, by coincidence, another graduate student appeared in my office, showed me an article about the very same company and said, “I think I found your job.” She was right, of course; after that it was just about the job hunt (another long post). Was all my education and training a waste? Hardly. I was a trained anthropologist with extensive technical expertise, had years of experience watching how people interact with technology, and had a couple of years’ experience in a consulting environment from my previous graduate degree. Those were all qualifications people were looking for. Once I cracked the code of what I wanted to do, and where it was valued, I was fielding multiple offers precisely due to all the effort I initially thought I had wasted by not getting the PhD.

For me, it was far and away the best choice then, and it still is. I have had a great career, multiple careers actually, and for all of them that MA in anthropology has been a major factor in getting those positions. At this point, I really don’t have a personal or professional need for a PhD, and a vanity PhD seems like a waste of everyone’s time on already strained university budgets.

So, that’s why I didn’t get a PhD.

Applying Rolling Cohort Analysis to Unstable Countries

A few years ago, when I was working at Kodak, a friend and I were talking about the idea of Rolling Segmentation. More recently, we have been talking about how that thinking can be applied to issues of instability and insurgency.

I have not given the idea much thought for the last few years, and on rethinking rolling segmentations/cohorts now, I have more questions than answers. Not the least of which is: how is it all that different from any other longitudinal study? There are a number of those out there about how people’s political attitudes shift as they age, but I would be leery of trying to extrapolate them between different cultures. When I go back to the beginning of the idea, it popped up at Kodak as an attempt to explain why, over time, technology was seemingly adopted by older segments that had not adopted it before. My theory was that if you looked at the adoption trends over years, you would find that a lot of the uptake was in fact the 20-30 segment simply aging into the 30-40 segment, giving the illusion of older populations adopting the tech. The 20-30 year olds simply dragged the technology with them forward in time. That does not explain the entire uptake, of course; there will always be early adopters across all ages at any given time.

It also suggests waves of adoption, when in some cases I am betting it is more of a moving wall for certain types of adoption. Music is in waves: each generation or so has its own music, those styles routinely build on each other, and each style waxes and wanes in relative popularity. Tech, on the other hand, is more of a wall: some early adopters pick it up, and if it is of viable long-term value to people, meets some need AND nothing superior comes along, the succeeding generations will pick it up. In some areas of technology, we have seen more rapid flows up the age chain from younger early adopters to older: think iPods/MP3 players replacing portable CD players. Thinking about it, maybe it is frequency and overlap. Some waves have a longer life than others for various reasons.
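That aging-in effect can be sketched as a toy simulation. All of the numbers below are hypothetical, invented purely for illustration: track adoption by birth cohort, snapshot the same cohorts ten years apart, and watch an older age bracket appear to “adopt.”

```python
# Toy illustration of the aging-in effect behind rolling cohort analysis.
# All figures are hypothetical; the point is the bookkeeping, not the data.

# Share of each birth cohort (keyed by birth decade) that has adopted some tech.
adoption_by_cohort = {
    1980: 0.60,  # adopted heavily in their twenties
    1970: 0.10,
    1960: 0.05,
}

def age_bracket(birth_decade, year):
    """Return the 10-year age bracket a birth cohort occupies in a given year."""
    age = year - (birth_decade + 5)  # use the cohort's midpoint birth year
    low = (age // 10) * 10
    return (low, low + 10)

# Snapshot the same cohorts in two different years: adoption within every
# cohort is frozen, yet the age brackets tell a different story each time.
for year in (2005, 2015):
    print(f"--- {year} ---")
    for birth, share in adoption_by_cohort.items():
        print(f"ages {age_bracket(birth, year)}: {share:.0%} adopters")
```

Between the two snapshots the 30-40 bracket jumps from 10% to 60% adoption even though no individual adopted anything new: the 1980 cohort simply dragged the technology into the next age segment, which is exactly the illusion described above.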

The question is not then the difference between Rolling Cohort Analysis (RCA) and other longitudinal studies (because RCA is by nature longitudinal); it is more of a lens through which to question the model of “segmentations” as usually found in business analysis. It is a warning bell that looking at segments only in a single moment is an artificial and inaccurate representation of the possible future. In turn, can it be used to better judge the size of the current market and, more importantly, predict the size of future markets? Segmentations as usually presented are static representations of the market, a simple snapshot of the world as it stands. There are studies that look at how various segments are growing, e.g. predictions of the size of the Latino population in the US, or the effects of birth rates in other countries, and people do try to do future planning from that. What I am getting at is that RCA is more a way to look at something than a method in itself. The ultimate question being: as an age cohort (something that requires definition for each study) progresses through life stages in a particular culture, what evolves, and how? What stays stable?

But companies, of course, don’t want to age with their customer base. As an example, take the questionable “most coveted 18-35 year old” demographic, practically a catchphrase at this point. As Nike’s customers age, Nike doesn’t want to stop appealing to that lower age group. I live in a building with that problem: it is seen as an elderly “old money” condo building, and the resident population is aging out, i.e. dying. Nike, my building, the Army, Apple and the rest all need to absorb people at early age groups to get them to carry the technology/brand message/ideology forward. Sure, they want customers in all segments, but a customer with a potential 50 years of lifetime purchasing and influence is more valuable than a customer with 10 years of lifetime purchasing left. You and I are of shorter time value to companies; our 30-year-old colleagues are of higher value.
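The point about remaining purchasing years can be made concrete with a back-of-the-envelope net-present-value comparison. The annual spend and discount rate here are invented numbers, purely for illustration:

```python
# Hedged sketch: compare the remaining lifetime value of two customers with the
# same annual spend but different numbers of purchasing years left.
def remaining_value(annual_spend, years_left, discount_rate=0.05):
    """Net present value of a flat annual spend over the remaining years."""
    return sum(annual_spend / (1 + discount_rate) ** t
               for t in range(1, years_left + 1))

young = remaining_value(500, 50)  # a 30-year-old with ~50 purchasing years left
older = remaining_value(500, 10)  # a customer with ~10 purchasing years left

print(f"30-year-old: ${young:,.0f}")
print(f"older customer: ${older:,.0f}")
print(f"ratio: {young / older:.1f}x")
```

Even with identical spending habits, the younger customer is worth a bit over twice as much in present-value terms, before counting any of the influence effects mentioned above.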

Back to RCA: it is just another way of talking about, and a way to highlight, the dynamic nature and potential acceleration of concepts over time. It is a multi-modal analysis that looks at intersections of multiple kinds of adoption in an effort to suggest directionality. Can it also be used to predict disruptions? The CD/iPod could be an example: CDs killed records in pretty short order, but CDs really had a pretty limited lifespan all things considered. I would be willing to bet that if you looked at the uptake rates of MP3 players and iPods among all age groups that previously owned portable CD players, you would find a more flattened adoption pattern across ages than with something like personal computers. Why? Well, convenience can’t be discounted, but we also had a population across a much wider age range that was comfortable with computers and had access to high-speed internet, both of which are critical factors in the adoption of portable digital media.

So that’s a long walk to get to asking how RCA is of value when looking at questions of unrest and large-scale behavior among populations. Without significant access to multiple variables, the answer is “beats me!” We still don’t have a great understanding of the socio-cultural dynamics of places like Afghanistan, Iraq and Iran (just to name three in the news frequently). These are places that are in some ways culturally designed to be opaque to outsiders. As much as religion has been a theme of liberal vs. conservative debates over the last decade in the US and in this upcoming election, we still don’t have a clue what it means to live in a society where someone’s interpretation of religion permeates every aspect of life, and that person holds considerable authority. Nepotism in hiring is not corruption; it is meeting obligations to family. Denying anything to outsiders is normal; it is not outsiders’ business. What we call a lie is not a lie in another context, and if you came of age in the Saddam era, you learned a lot about the value of keeping your mouth shut about and around authority figures. In the US we just don’t understand or appreciate the overwhelming power an authority can gain by cutting people off from, or severely reducing access to, news sources outside of the regime. Sure, we get up in arms about keeping a free and open internet, but we as a country don’t have a first-hand clue about the power of state-run media to completely shape reality. Sure, our government, corporations and political parties have spin control, but that’s nothing compared to the totalitarian control found in other places.

Maybe RCA is simply more difficult (not impossible) to apply to unstable countries. In the US, we expect a certain level of rebellion by younger generations. It is an expected part of the process that younger people are going to dress, speak, have music, use technology and more in ways that surprise/offend/concern previous generations. We encourage this behavior by giving words like “follower” or “sheep” negative connotations for someone we see as not having the intelligence/brains/imagination to take risks in creating a unique identity. We don’t have family fealty; we have the “momma’s boy” (excuse the dated language) and “boomerang kids” for people who don’t or won’t break with family to create their own world. (By the way, I was once on a train in India and an older man pronounced that our emphasis on kids leaving to create their own homes was proof that Americans don’t love their families as much as the rest of the world does. A very interesting perspective.)

In unstable countries, displays of self can have a more serious impact at the individual and family level. Americans just don’t get the power of perceived “shame” and “honor” and how it can be used in all kinds of ways to manipulate and mold people. RCA is in some ways the tracking of “displays of self” as they evolve over time. Perhaps the more stable the culture, the more regular the “beats of the waves” are internally to a cohort as compared to other countries. I think the question about unstable countries in regard to RCA is how much of it is X steps forward and X steps back, and why? Or in more stark terms: in the US, when an age cohort starts protesting how the nation is operating, the US government does not disappear them. Yes, we have indeed had over-zealous and possibly illegal reactions in some cities to the Occupy Wall Street movement, but it is doubtful that any of those protesters are being hauled off to prison for years, tortured just for the sake of it, and possibly vanished along with family and friends. You can’t have a rolling cohort when the government is rolling the bodies of the cohort into a ditch. Limits on speech, limits on access to outside media, and reprisals against movements as they evolve could all, in less stable countries, contribute to severely attenuating how influential a cohort is as it ages. At some point most people will just opt to survive.

Undergrad Seminar: Time Management

Here we are in the second half of the academic year. If the first half got off to a rocky start, maybe this is a good time to talk about time management. Not the “7 habits of that smugly overambitious go-getter” variety. This is aimed more at the “How can I squeeze school into my hectic schedule of procrastination and binge drinking” style. In other words, for the rest of us. This is not to ignore what I think is the real value of the university experience: the freedom to explore, to question, to learn what you never expected. If you go through school without some kind of an “Ah ha” moment, then you have to ask if you really took advantage of the opportunity. Time management is making sure you have the ability to explore those “Ah ha” moments.

What does time management mean? It is simply developing a strategy that helps you set reachable and realistic goals and treats school as something akin to a job. School is not the same as a job, I know that. In the US, heading off to college represents all kinds of milestones and transitions toward adulthood, including making a lot of really stupid mistakes. Since stupid mistakes are part of life, you may as well factor this in and manage the parts you can. But if you can put yourself into the mindset that school IS your full-time job, it might help with things like procrastination (my all-time biggest problem in school). That part-time job you have in the library, or as a teaching assistant, or elsewhere is something you have to do to make ends meet, but school is your full-time job. (This is referring to full-time students. Part-time students are often already fighting a massive time management battle.)

In addition to getting those “Ah ha” moments that we all love, there are some very basic tangible goals you want to hit: graduate in 4 years, 5 at the outside, with the GPA, experiences, training and recommendations you need to take your next step, no matter what that may be. School is about more than the GPA and getting out, but school is also expensive and your GPA at the end matters, so it is in your best interest to keep that in the back of your mind.

First rule: Incompletes are bad debt. Very Bad Debt. No matter what else you take away from here, learn that taking an Incomplete at the end of a class should be seen as a last option. You would be amazed at how often someone’s college career gets derailed due to piling up incompletes. No, your instructor will not take pity on you because it’s 5 days to graduation and that one incomplete is in your way. When you have an incomplete, you have very little room to negotiate. You don’t even have the option to take a lower grade if the instructor decides you have to finish that paper or project to complete the course. Never take an incomplete? Well, that’s strategy, isn’t it? It’s much better than an F or D or maybe a C, but if it is a class outside your major and you really don’t want to spend more time on it, would you rather have the B or the bad debt of an incomplete that can become an F? I once knew someone who took an incomplete to get an A+ instead of an A; maybe I am a slacker, but that is insane given how much riskier the Incomplete is. Also, instructors talk; if people find out you are taking several incompletes, they are going to stop giving you that option. Remember that taking the Incomplete is not your choice, it is your instructor’s. They have no obligation to give you one, because it is bad debt for them as well! They have to give you a grade, chase you down before it becomes an F and listen to your excuses because you keep putting off that paper or project you owe them. If you are piling up incompletes, you may need to lay out a semester just to get them off the plate. Having an incomplete is mentally the same as carrying over that class (or those classes) into your next course load.

Oh hell, you already have an incomplete? Weren’t you just reading all that… ok, ok, fine. I’ll calm down. Either you have screwed up badly or some legitimate misfortune befell you at the last part of the semester. All we can do now is move forward. That incomplete is a big pile of rotting food in your kitchen, and you have GOT to clean it up before it gets into the rest of the food and really stinks up the whole house. To start with, there is no easy solution that will not increase your workload, unless you have some miracle deal with the instructor. You cannot “borrow time” from your existing workload. If you take that attitude, you are looking at a domino effect of incompletes. Is it starting to sink in why this Incomplete of yours is a big friggin deal?

There is only one way out of this: give up your free time to finish the job. That’s it, the only solution.

You can’t take the time from the work you already have to do, like the 500 pages of reading you were assigned over the weekend that you weren’t going to do anyway. I KNOW how hard this is. I am a terrible procrastinator, and we are the worst kind of people to have incompletes, because the deadline is often vaguely out there but not quite real. The longer you take, the better the final product is expected to be! Maybe this is one of those “screw it, I will do slightly worse work and take a B for the paper” moments on this particular project. But you have to turn in something or risk getting a failing grade. I am not even going to say you are going to feel better getting it off your plate. Having to finish this Incomplete is going to put you behind on your other work, which you will have to double up on to prevent it from going incomplete too. By the way, if we are talking about a 10-page double-spaced paper, please don’t write and tell me. I will run screaming from the room. This blog entry is nearly four pages double spaced using Arial 10 point font. 10 pages is really not that big a deal.

Make a plan, set a drop-dead date and make your idea realistic: what is the minimum you have to do to get the grade you want? My apologies to my faculty friends, but this is triage, and that is the crass reality of it. Your goal is not to win the undergraduate award for writing; it’s to get the incomplete off your plate. Scale back as much as you can: do you really need 40 sources, or will 10 do? Is the instructor looking for regurgitation of their pet ideas or original thought on your part? Being that challenging student during the class is great. But now it’s an incomplete, a pain in the ass, and not the time to get clever. Have you got a draft? Great, drop it off at the professor’s office. You might not get comments, but it shows a good-faith effort on your part toward meeting your commitment. If they do comment, you might get lucky and they say, “Hey, if you just add a paragraph about X, we are good to go.” And please, dear Lord, don’t drop off an idea they already rejected, that same dumbass, irrelevant, unrealistic idea that you stubbornly hung on to and that got you the incomplete in the first place. LET IT GO. I have watched people do that very thing. I don’t know what insanity overtakes them, but for the love of Pete, knock that crap off.
Do that incomplete. Do it this weekend; do it over two weekends if you have to. Unless that paper is huge, two hard weekends can cover it.

When do IEDs quit being IEDs? Why are we still treating insurgent munitions as folk arts?

When I started as a university student, I was studying folklore and material culture, and IEDs certainly qualify as material culture with the potential to tell you something beyond basic forensics. In this entry I am looking at IEDs using the language of business, innovation and ___________ ? The idea is to see what insights can be gained from thinking about IEDs outside of military language.

We still read about IEDs in the popular media as if they are a folk art or the equivalent of some kind of primitive booby-trap. But people have been making these things for a lot of years, so that’s a lot of improvising, and in turn, I am suggesting, what must be a fair bit of standardizing. (As a note, this entire post is based on this premise. If the premise is not true, well… what do you want for free?) Agreed, from a threat perspective, the sophistication matters little at the point of use or to those injured and killed by them. The effects of even crude devices are well known. But at what point do you go from thinking of them as completely improvised weapons of opportunity and start thinking of them as standardized weapons and part of a formal overall weapons system? To continue thinking of “IEDs” as “improvised” belies the increasing sophistication from which I suspect they come into being.

If you want to know what I mean by “standardized munitions,” go into any outdoor store that sells hunting gear and you will see standardized weapons and ammunition. They are made in large factories with relatively strict controls on production, quality, distribution and sales. While there are multiple value and supply chains these weapons travel through (for example, those destined exclusively for military use vs. something available to a private citizen), when the system works properly and the applicable laws are observed, these weapons can be tracked from manufacture to final distribution. They have a path they follow from factory to dealer to consumer (be it a person or a state).

However, what I am questioning is whether the difference between an improvised device and a standardized device just boils down to this: are all the components collected, assembled and distributed from a single point, or are individual components distributed from multiple points and then assembled at or near the point of use? Indeed, to jump to the punch line, it seems the main difference between them is state approval: if they are state approved and regulated, they are legitimate munitions. If not, they are improvised munitions.

At this stage of IED development, “improvised” speaks more to a production process than to a lack of standardization. We are really speaking to the multiplicity of possible components that can be used, the non-standard nature of the distribution channel, the point at which the components come together, and the lack of state approval or regulation. It is important to tease these points apart because those elements that we use to define them as “improvised” are in fact the major strengths insurgencies seem to standardize munitions around.

In the US people hear about insurgents making explosives in an ad hoc fashion, like some kind of hillbilly explosive or bathtub gin. If you keep up with the news, you know that is not true. They have become increasingly sophisticated, and we are no longer just dealing with fertilizer and oil. While there may be a multiplicity of components that make up IEDs, I am suggesting that a standardization of production principles has developed that allows for multiple production methods. In fact, it can be suggested that one of the strengths of insurgencies in asymmetric warfare is not the diversity of the product portfolio (IEDs, EFPs, etc.) but the diversity of the production methods for their portfolio of products, the munitions. This diversity of production allows a set of specific principles or rules to be put in place that can be applied across a variety of situations on a localized basis. It is as if McDonald’s supplied the basic plans for the menu, the marketing and occasional advising, but the franchisee could purchase stock locally or from the national distributor, depending on what worked best in that market. They lose the classic buying power you get with economies of scale, but it also gives the insurgency much more flexibility in the system, so they don’t have to worry about centralized shortages.

More mechanically complex weapons systems, from a handgun to a warship, depend on strict manufacturing standards with little to no tolerance for variation. IEDs, by contrast, can generally have fairly wide tolerances for variation between components. Multiple power sources can be utilized, and trigger mechanisms can be as complex or as simplified as needed. While some of the components can be complex to manufacture, there are a number of variations of each component that can be mixed and matched to create a completed munition. This high level of variability is enabled by focusing on diversity of production methods, as opposed to diversity of product, which keeps a certain amount of slack in the IED supply chain. If the source of one component runs out, the high tolerance for variation means that a component with similar characteristics can fill the gap.

There is one more advantage that the diversity of production methods provides an insurgency in this context: a very high return on investment (ROI). In the simplest terms, an IED that costs $200 or so to create can force a standing military to spend millions of dollars attempting to create technical means to defeat it. Using this diversity-of-production principle, an insurgency can react to a technical defeat solution much faster than those solutions can be created.

EPIC 2012 Call for Submissions

If you have never been to EPIC or submitted to it, it is a great conference for anthropologists, engineers, designers, government types, or anyone else interested in how to bring user experience into the decision-making process.

Here is the notice:

You can access it on the website here: http://epiconference.com/2012/
To join the EPIC mailing list, go here: http://eepurl.com/hSU-c

We also recently added some information about the conference hotel and wonderful Savannah to whet your appetite & help you plan ahead. Please tell us what else you need to know – info@epiconference.com

Call for Submissions

The Program Committee of the Ethnographic Praxis in Industry Conference is pleased to announce that the submissions process for the 2012 Annual Conference is now open.

This year’s theme is Renewal. After three years of economic recession, a year of political ferment and the rise of the global Occupy movement, it is hard not to conclude that renewal is currently part of the zeitgeist. This opens up questions for the EPIC community: what’s our role in renewal and how and why might we renew ourselves?

We welcome submissions for Papers, Workshops, Artifacts, and Pecha Kucha sessions that address this theme. We also invite graduate students to submit proposals for inclusion in the Doctoral Colloquium. Comprehensive details of the Call are available on the EPIC 2012 conference website, and below you’ll find some basic but important information about the submission process.

EPIC will be held this year at SCAD in Savannah, Georgia, from October 14–17.

Submission summary

Submission deadline dates are as follows:

Papers: 13th April
Workshops: 20th April
Artifacts: 27th April
Pecha Kucha: 4th May
Doctoral Colloquium: 4th May

American Anthropological Association dissolves, decides to start over tomorrow.

APG Newswire WASHINGTON, D.C. – The American Anthropological Association (AAA) made the announcement today that its Joint Committee for Publishing and Employment Services unanimously recommended the immediate dissolution of the AAA, stating there was nothing left to study.

James Curry, the newly-past President of the now defunct AAA, stated the organization had no choice. “Look, it’s all been done. All of it. We have talked to every god forsaken group on the planet, and there is nothing left to study. Frankly, there is not even a job market out there for students.” Increasingly, graduate students of these former anthropology programs have found themselves with little to do even when trying to complete their dissertations, much less do meaningful publishing. John Gault from Indiana University talks about hardships in the field: “I originally wanted to work with the Tsohon-djapa tribe living in the Javari region of Brazil. Turns out the F’ing Discovery Channel gave one of the kids there an HD webcam that runs 24/7. Now my dissertation is on some group of freaks outside of town that worship an old incandescent light bulb with a grease smudge that appears to be the image of Jesus. This blows.”

To hasten the demise of the former organization, the AAA is recommending the destruction of all books, letters, monographs, white papers, dissertations, and even master’s thesis work in the former field of Cultural Anthropology. The committee began by burning the minutes of their own meetings, along with the abstracts and agendas of every meeting and conference the AAA has ever been a part of.

Foster Kerry, the head of the committee, was thrilled with the move. “I am very excited for this new untouched field. Just imagine all of those utterly primitive cultures out there, such as Ireland, we know nothing about. With the advent of transportation like the steamship and the auto-mobile, we have access to so many other places. Up to this point, what we know about these primitive peoples is from the writings of missionaries. 2010 looks to be a great year for this new field of study.”

Not everyone is so pleased. Martin Cost, a full professor at Walknut University, has serious concerns about the announcement. “What the HELL, what the hell does this do to my Tenure!?” was the first official statement from Dr. Cost when informed of the move by APG reporters. “I am not doing that fieldwork crap again, no way. My whole career has vanished.” APG asked one of Dr. Cost’s graduate students to comment on the potential destruction of most tenured faculty members’ careers, including Dr. Cost’s. That graduate student stated, “BAHAHAHAHAHA! HAHAHAHAH! HAHAHAHAHHAHA!”

Dr. Curry has some understanding for the concern. “Look, it’s true: teaching positions, publishing, tenure, sex with natives before any ethics are laid out, are totally up for grabs at this point. Right now we have a lot of High School PE teachers filling in at their local colleges and universities teaching ‘health studies’ until some real research gets underway. We expect this to be a banner year for grants; people love to fund new fields of study.”

An ad-hoc committee has already been formed to discuss what to name this new field and to set up a professional organization. It is likely to focus on documenting the ways the simple, primitive, innocent folk lived before they were corrupted by modern conveniences. An overall “Study of Man,” if you will.

Librarians nationwide also hailed the move for freeing up an enormous amount of space in the country’s libraries, which is now expected to be used for coffee and pastry kiosks.