Sunday, August 12, 2012

Human Enhancement as Fantasy Projection

Machines “project” as the phenomenologists say or, as the techno-theorists put it, they “extend” our human senses and our consciousness. The addictive and phantom effects of the internet have everything to do with this. Using this same phenomenological reading of technology, trans- and post-humanists are fond of speaking of human “enhancement.”  

But a phenomenological analysis of technology would remind us that the augmentation in question is more attuned to the machine than it is (or can be -- and this is in spite of the detours that Latour and actor-network theorists rightly emphasize) cut to human measure. 
 It is a reflection of this very attunement that, to speak as the ethnographers and sociologists who study this phenomenon, we are “machine-obedient.” 


Nor are we as mechanically tractable or responsive as we are because we wish to be — because we love our machines, erotically, affectively, as Latour suggests that we do[1] or else as Donna Haraway has also argued (albeit in another way)[2] — but quite simply inasmuch as we have to be machine-obedient simply to use our machines in the first and last place. 
This is true of everything from our autos to our computers and cell phones and cameras, indeed even Facebook and so on. 


And here there is a network-actor loop (or loophole) at work: for it turns out that the greater our obedience, the more we comply, the better technologically attuned or, just to show our easy familiarity, the better techno-geeked out we are, the “better” the machine “obeys” our every whim.  
Christian Bale's Batman survives the end of Christopher Nolan's The Dark Knight Rises not just because of his technology, from his batsuit and its associated gadgets to his humvee (with wings) or tank transporter in the sky, but because of his intimate, second-nature, or automatic (done weeks ago) coding prowess.
 
Bruce Wayne not only zen-wills himself out of a physical breakdown, with the aid of chiropractic, a 'science' traditionally derided by modern medicine but widely known for its efficacy, but also, through transcendence of will, he rises, learning fear, becoming like a child, out of the cave of his Jodhpur prison (this ascent is the meaning of the title).
But, and this is the film's great Lacanian secret, this is the heart, this is the rule of the Symbolic Order, Bale's Batman in addition to being Batman and having all Batman's resources (that would be comic book fantasy oodles of money colliding with real-life OWS corruption and the associated economic implosions of the same, and that would also be the star power of Morgan Freeman and the always excellent Michael Caine) also writes software like no one else can. 
And he does it on the fly, there's that deus ex machina of movie time, a full six weeks ago, always already. 
It is this that saves him -- precisely by saving the girl, who, shades of Pretty Woman, saves him right back.

Ah, equality. 



Notes


[1] Bruno Latour, Aramis, or the Love of Technology (Cambridge: Harvard University Press, 1996).
[2] Donna Haraway, Modest_Witness@Second_Millennium. FemaleMan©_Meets_OncoMouse™ (London & New York: Routledge, 1997); but see also her "A Cyborg Manifesto," in Simians, Cyborgs and Women: The Reinvention of Nature (New York: Routledge, 1991), chapter 8, pp. 149-181. Haraway's "A Cyborg Manifesto" (http://www.stanford.edu/dept/HPS/Haraway/CyborgManifesto.html) is an online dissemination of this chapter that works, if anything has worked, through the erotic, feminist fantasy cover designed by Haraway and the painter Lynn Randolph, whose work was originally not attributed. 
As the artist describes her own picture: 
“I placed my human-computer / artist / writer / shaman / scientist in the center and on the horizon line of a new canvas. … A giant keyboard sits in front of her and her hands are poised to play with the cosmos, words, games, images, and unlimited interactions and activities. She can do anything.”  
With all the power lent by the imaginary, Haraway remains the go-to reference for writings on the posthuman or human-cyborg techno-hybrid. See for one example, just for a start, Haraway’s interview with Hari Kunzru, “You Are Cyborg,” Wired Magazine, 5/2 (1997): 1-7.

Wednesday, August 1, 2012

Nietzsche: On "getting oneself a culture"

Nietzsche's own reflections on what is needed for an "education" as such are quite formidable—even as his own education was an extraordinary one. To this extent, we betray something of the limitations of our own formation whenever we find ourselves insisting that Nietzsche took or borrowed his ideas from other thinkers—what does it mean (and this will be the point here) to "take or borrow" an idea?—ranging from Pascal and Spinoza, or else Spir and Lange, or Emerson, or Gerber, or Stirner, or ultimately and of course from Wagner himself (especially for the Wagnerians, for whom no limit to the master's own cultural prowess can be imagined).  

I am not saying that Nietzsche was not familiar with these thinkers, far from it. I am saying that an education is this familiarity and much, much more. Thus although it is amusing to note that the identity of the supposed origination of (the so-called ‘sources’ for) Nietzsche’s ideas just happens to change in the scholarly literature over time (and not less with the mood and, nota bene!, educational formation of his commentators), it is also noteworthy that the very same set of assumptions applies (negatively speaking) for those who are fond of insisting that Nietzsche could never have read Kant (just to pick one contentious example, contentious given the influence of Kant on the 19th century, an influence we fail to see in the 20th as in the 21st century, at least so far).
The idea that an education, the getting of or the having of one, is a simple affair, and thus that the parallel idea of an upgrade to the more-than-human, that is, now, the trans-human, would simply be like taking a course, signing up for an instructive module, supposes that one pretend (as transhumanists do like to pretend) that one can or should set aside questions of cultural inequalities, differences in wealth, "class" differences, and so on. In this (an sich inherently optimistic when it is not calculating when it is not deliberately mendacious) regard, the transhumanist movement may be revealed as a humanism, here using the term as Sartre once spoke of Existentialism as a Humanism.
By contrast Heidegger’s “Humanismusbrief” is written against such a presupposition. See Sartre’s L’existentialism est un humanisme and compare both with Sloterdijk’s controversial Elmau lecture: Regeln für den Menschenpark. 
Hence and at least in principle, human enhancement may be regarded, if only for the sake of argument, as corresponding to “enhancement for all,” like “micro-chips for all,” or “airport security searches for all.”  

Monday, July 30, 2012

Transhumanism as the New Future of our Educational Institutions

Many scholars, and nearly all tech-related advertising, assume a chirpily upbeat focus on technology and how it is changing the world, transforming us: the transhuman is the human plus (whatever) technological enhancement.   
As a specific example, Stefan Lorenz Sorgner is one scholar who raises the issue of Nietzsche and evolution, an issue that is itself far from straightforward (most readings of Nietzsche and evolution depend upon a fairly limited understanding of Darwin and, not less, a fairly limited understanding of Nietzsche's own understanding of Darwin). 
We can hardly raise all the relevant questions that remain to be explored on the (very, very) complicated theme of Nietzsche and Darwin, but the key issue seems to be the (may we say mildly Lamarckian?) parallel Sorgner constructs between education and genetic enhancement. For Sorgner, and his argument depends on this, education and genetic enhancement are "structurally analogous procedures."  
This is worth noting, as arguments in favor of technologically sophisticated enhancements in many arenas similarly depend upon such analogies.[1]  
But what is “education”?   
Shall we understand this in the traditional sense of Bildung or as what counts for the French as formation, where we may speak of either in terms of what Nietzsche also called "getting oneself a culture," that is: personal and intellectual cultivation?


Or, and now apart from these traditional meanings, will an "education" correspond to nothing more than the business (emphasis on the economic or cost-based affair) of acquiring and conferring, i.e., obtaining and selling, degrees and certificates — all such modules, courses, and degrees parallel to the many add-ons and upgrades, like iPhone or Android apps and the enormous market in cell-phone accessories, which pales in comparison to the market for iPad accessories, Apple and otherwise? 
And yet, it may be that this surface parallel calls for a bit more reflection, especially with regard to Nietzsche who himself reflected quite a bit on educational institutions as well as the idea of education—even if we begin with his very paradoxical, very provocative claim: “There are no educators” [Es gibt keine Erzieher] (HH II, The Wanderer and his Shadow § 267).  
What is certain is that many of us even within the academy do tend to suppose that education is just and only the acquisition of such degrees, especially at the graduate but also at the undergraduate level, and especially as evident in the current debate in England and mainland Europe on the virtues of the privatization of the university—a debate which manages to overlook any review of the actual practice of the same as this can be found in the US.
European advocates focus on Princeton, or Yale, or Harvard, somehow managing to miss the hundreds of thousands of tuition-driven, for-pay or for-profit institutions that abound at every level of post-secondary education in the United States. As for me, I'd compare CUNY or SUNY or the University of California system to private schools, even top-tier schools, any day — if not, of course, when it comes to prestige, as that is a market and class affair, but indeed when it comes to education. Nor would I be the only one. The more critical point however is indeed that European fantasies about private schools tend to suppose that all private schools work like top-tier schools. For a discussion, see Babich, "Education and Exemplars: Learning to Doubt the Overman" (but I also recommend the other contributions in) Paul Fairfield, ed., Education, Dialogue and Hermeneutics (London: Continuum, 2011), pp. 125-149.

No need for factual feedback to sully our models, as Orrin Pilkey, a very practical or applied or hands-on coastal scientist has argued with stunning consequentiality when it comes to beach erosion and the public costs of “maintaining” the same and with very specific meteorological applicability to the debates on global warming.
See Orrin H. Pilkey and Linda Pilkey-Jarvis, Useless Arithmetic: Why Environmental Scientists Can't Predict the Future (New York: Columbia University Press, 2007) as well as Pilkey's new Global Climate Change: A Primer. And see his very practical, timely editorial: "The road ahead on the Outer Banks," Newsobserver.com, Sat., Oct 08, 2011. I discuss Pilkey's analysis of modeling further in Babich, "Towards a Critical Philosophy of Science: Continental Beginnings and Bugbears, Whigs and Waterbears," International Journal of the Philosophy of Science, Vol. 24, No. 4 (December 2010): 343-391.

I.e., no empiricism, please: we’re idealists cum speculative realists.


 
 

[1] Stefan Lorenz Sorgner, "Beyond Humanism: Reflections on Trans- and Posthumanism," Journal of Evolution and Technology, Vol. 21, Issue 2 (October 2010): 1-19. Here cited from: http://jetpress.org/v21/sorgner.htm
  

Sunday, July 29, 2012

Immortality, Mitochondria, and Media Futures

With the idealized expectation of the technological 'rapture' goes a vision of technological oversimplification that is not quite a result of our being closer and closer to a future we once imagined. Said otherwise, talk of 2045 was, once upon a time, talk of some unimaginably distant era, as was talk of 2012. Or indeed 1998—which was indeed and to be sure the supposedly "future" time-period of the 1960s American television series Lost in Space.
To see this it is worth thinking a bit about Aubrey de Grey, a software developer or programmer who, having learnt sufficient biology for the purpose,[1] has been arguing that we can resist aging if we avoid its causes, to wit, the oxidation of cells and the build-up of waste products in those same cells. 

Having determined that it is the mitochondria that develop problems or 'damage' by getting gunked up (or losing 'efficiency'), de Grey proposes that we send in little nanobots to clean them out (or indeed, as de Grey also imagines, to serve as so many mechanical replacements for what are clearly less-than-ideal organelles). 
 
What de Grey has in mind is close to the miniaturized spaceships of Fantastic Voyage, the 1966 film of Raquel Welch's travels on a microscopic level, which film title just happens to accord with one of Kurzweil's first books for his ventures into technological rapture.  


De Grey not only runs an anti-aging foundation (and one supposes that he has all manner of highly motivated and well-heeled investors backing him) but also has an appointment on the faculty of Kurzweil's Singularity University, straddling as he does both sides of the biotech and computer tech industry.

But for a critical overview that also applies to Kurzweil's prediction of the coming 'technological singularity,' see Richard A. L. Jones, a professor of physics at Sheffield University, "Rupturing the Nanotech Rapture," IEEE Spectrum (June 2008): 64-67, and see further Jones's earlier Soft Machines: Nanotechnology and Life (Oxford: Oxford University Press, 2004).

And yet, as it turns out, de Grey's pitch, like Kurzweil's own, is less about biology than technology and marketing, precisely in the way we relate to technology as those who have, as fully vested heirs of a cargo cult, grown up with devices we know how to use, from electric appliances, toilets (to be Illichian here), televisions and computers, cell phones and coffee-makers, to automobiles and airplane travel, but could not ourselves fabricate if our lives depended on it (this is the ominous and recurrent subtext of the future-as-desert film genre, like Road Warrior or Mad Max or Blade Runner, and even the novel turned film, The Hunger Games). Assuming, as we do, that someone else makes the tool, or writes the code for our app idea, i.e., assuming that some factory actually deploys the technology, the gadgets are what it is all about.  


[1] Although de Grey does not have a post at Cambridge University and there was a certain understated scandal associated with the implication that he did have one, he does hold a doctorate from Cambridge for his The Mitochondrial Free Radical Theory of Aging (Austin: Landes Publishing, 1999). See also Denham Harman, “Aging: A Theory Based on Free Radical and Radiation Chemistry,” Journal of Gerontology, 11 (3) (1956): 298-300.

Friday, July 27, 2012

Hence, with all the troubles facing hard science and soft science, the science of clouds and apps that is the stuff of the coming technological rapture, vague as it is, may promise more success.  
For another take on the matter, see The Singularity Hypothesis, a discussion that lays claim to assess the matter scientifically inasmuch as the authors (hi! Eric) are keen on the same.  Rah!

Can't get Apple and Lenovo (IBM) or some other PC to play right? Buy a Mac, say the experts (and then get ready to buy a lot of software, all the stuff you already have, once more, and innumerable times more, because what Apple really excels in is getting its customers to like buying stuff: apps, upgrades, software, tech time, etc.). Make a virtual machine, dual-boot it (at least for the minority still capable of doing that these days). A recent tech tip for those who wish to sweat the details (most folks would rather switch than fight...) is here (though if you read this PC World article it turns out that it too is all about switching...) or here. Or, because, like the two-party system in the US, it is really only a matter of profit (i.e., there is no feud between Apple and Windows), here.


Apple and IBM still won’t play right but you won’t know it.


Linux operating systems are not the answer because Word, which is arguably the touchstone (no one can handle WordPerfect, which has given up and become a Word impersonator as a consequence), is not the same as Open Office. In fact, Word on a Mac and Word on a PC (I bristle at this because what are Macs if they are not PCs, toasters that don't toast? jetpacks without rocket fittings?) do not give one identical results, although you need to look at the print results to note the difference (so make a PDF and minimize it, it'll still be there, but coherent unto the file you crafted without the changes introduced by the new platform: WTSIWTG). So let's all go blame Microsoft as if it were the great evil that besets your technological woes (slow computer? get a Mac!), but the problem is that hardware makes a difference for what software you can use. Your screen makes a difference (as Mac knows all too well), your computer/software settings make a difference (whether known to you or not), and now Google and Facebook and Twitter and other bubble protocols, to go with your television programming, also make a difference. The trick will be figuring out what kind of difference that difference makes.

Or maybe, owing to our own contouring of our own consciousness to the limits and constraints of the digital interface, be it that of email or of gaming or of the increasingly ubiquitous social networking (Facebook now appeals to the young, and the old and everyone in between, despite the social horror that it is for teens to ‘friend’ their parents), we increasingly find the flatness of computer enhanced experience exactly as charming as its purveyors claim.  

Here we note the very specific (and very popularly Nietzschean) "faith" in science and especially in industrial, corporate, capitalist technology. If we read our Sloterdijk aright, or better yet if we read Guenther Anders, Theodor Adorno, or even Ivan Illich, we also know that this same industrial, corporate, capitalist technology has been with us since the interregnum between the two wars. But this is again and also to say that such a vision is fascist through and through. All this gives us is another reason to prepare for the coming singularity. And as with other raptures, one does not expect to have a choice. And one thinks this no matter how underwhelming the experience turns out to be in fact.

Like Conrad,[1] the object of girl-fan affection in a bygone musical, we “love” our iPhones — O yes we do. Here what matters is not affect as much as brand loyalty — O Conrad, we’ll be true. Even with all its limitations, we are happy to say: O iPhone, we love you.
There is a lot published on this, but see Jonathan Franzen’s op-ed piece, “Liking Is for Cowards. Go for What Hurts,” New York Times, May 28, 2011.

[1] I owe this reference to Tracy B. Strong who persisted in singing this for no apparent reason day and night while I was writing this essay. And repetition, any repetition, affords rather the same propaganda effect as a commercial.

Thursday, June 21, 2012

Technology, qua transhumanist hype-conventionality, has an ever-growing appeal, more than the vision of the robotics of the Asimovian past, and this may be traced, perhaps, to certain persistent limitations in cognate fields.  


Practically minded as this author is, I like to suppose that this may be because the biological business of genetic engineering, retrofitting genes, and suchlike has not been going as well as anticipated — this is on the reality side of things — arguably, or perhaps, owing to the pesky detail that genes work badly on the add-a-gene-and-stir model of genetic engineering, but also that cloning adult organisms seems to produce young organisms that senesce and die markedly faster than young organisms usually do, no matter whether they be sheep or mice or cats or Korean puppies for the clone-your-Shih-Tzu market, your Fluffy market, woolly mammoths or what have you. For where decimating the female dogs needed for the cloning business hardly matters in terms of time, count thirteen months and a lot more space and suffering for the elephants. Because we can. 
However badly we do it, we can.  Deaf cats and all.


Forgetting for a moment our fascination with "upgrading" ourselves, improving ourselves — we are all sure that we'd do so much better with that second chance, especially after a make-over, with all the born-again religiosity of the same — cloning dogs is probably the most practicable because abuse-preconditioned, abuse-predicated. 

The dog market in science is like the dog-meat market, except that it is the most egregious and thus, at the same time, seemingly the most harmless. Brought to you by South Korea, like North Korea home to an industry of dog torture of all and every kind: culinary, the fur and leather industry, and of course the biggest abuse industry of all, that of science.

For what science does to dogs, and has done from the start with Claude Bernard and his enduring contribution to vivisection (called "research") does not bear thinking about. 


So we don't think about it. 

Thus we ignore, rather especially as philosophers interested in moral questions, the ethically catastrophic commercial enterprise of dog cloning, speaking not of whether one should but of the consequences for those who do, quite apart from the dozens and dozens of dogs killed to 'manufacture' this one quasi-identical dog — but what is identity? the philosophers ask. And it was the philosophers, think of Descartes, who empowered animal vivisection from the start with the egoist, humanist distinction between animal or beast and the human being's scientific bestiality. 
Scientific bestiality? Or should we just say bestialism? 



Sunday, May 27, 2012

Sloterdijk: From Cynicism to Life Changes


At the end of Peter Sloterdijk's Critique of Cynical Reason, Sloterdijk is able to argue that futurists (like Toffler and McLuhan, his references, but also like today's Kurzweil) are dependent upon an earlier generation of thinkers -- not even cold-war, '50s and '60s thinkers, and certainly not '70s thinkers like Vinge, but pre-World War II and WWI thinkers like Friedrich Dessauer, but also Walther Rathenau and Adrien Turel -- in an uncanny context that was, on Sloterdijk's account, the crucible for the particular fascism that grew out of the Weimar Republic. 




Note too that Sloterdijk was somewhat younger when he wrote this book. Does this make a difference? Perhaps, but Sloterdijk was both more optimistic and less compromising at once.



Today we have a different set of concerns -- change your life, but heavens not the banking system, never fuss or muss with capitalism, don't question the lies you know the government told you, and why worry about those you don't know...


Some will find it hard not to think of Kurzweil's (or, better said, to respect the interest of copyright, Vernor Vinge's?) "technological singularity," while others will purr about proactionary ventures for what I have already opted above to name, via Star Trek, the machine-human mind-meld, when Sloterdijk reflects upon his Rules for the Human Zoo, noting that 
its strong epistemological linkage between concepts like 'Dionysian materialism' and 'vitalism,' a linkage made even more interesting by the fact that the life sciences and life technics have just passed into a new phase of their development. 
Here it should be noted that such references to the life sciences also point to the coming age of terror: in the name of transhumanism, in the hunt to jimmy the body with new replacement parts, there is an age of vivisection undreamt of -- and the university's biological sciences and psych buildings already drip red with the blood of their victims, almost as incarnadine as a medical or veterinary school. A student learns one thing: the pain of the animal, the pain (always call it the discomfort) of the patient, only gets in the way.


Transhumanism is NOT ABOUT the peripherality of the human, or the non-centrality of the human. It is about expanding the human via technology and above all using animals and exploiting the earth ever more thoroughly, ever more completely for human purposes.
 

This monkey with baby is constantly hurt. Hurt by science, by scientists and technicians for science's sake. This is your tax dollars at work. This is the work of the academics you admire, quoting their dicta the way medieval scholastics used to quote scripture, worse: because the scholastics raised questions -- and we don't.  

Most scholars could care less about animals. Even those who write on them.  Theory is theory: no fuss, no blood, please: we're writers. Most academics are even less concerned about nature: all they think matters is that everybody get together and agree (as if this were the key) that there "is" global warming. 

Those who call themselves Frankfurt School theorists assert that there is/can be no nature, because for their purposes there is not: there is only us. There is human need, greed, caprice.


And this is true of the new theory, same as the old theory.
Beyond the debate internal to the politics of German public intellectuals, the theme for Sloterdijk is anthropotechnics: the technique of the manufacture of humanity, and it is not a German but a global concern. As Sloterdijk puts it:
Nietzsche and Plato have invited themselves to the ‘symposium’ to comment on the ideas of Heidegger, to put forward their opinions on the drama played out in the clearing. The title of this drama? Anthropotechnics or: How human beings produce themselves. And suddenly everyone wants to be invited, everyone — dramatically — wants to be part of the debate, to take part in it.

Sloterdijk's point is increasingly relevant (though the Heidegger scandal will make all such references impolitic henceforward), and the message of Kurzweil's vision of the 'technological singularity' as it has been embraced by (at least some elements of) popular culture, when it is not the message of the genome project or stem cells, is indeed anthropotechnics, which is all about not becoming the one you are but, to be sure, becoming the one you wish you were, the one you 'should have' been all along.  

Call this the Harry Potter effect, or everyone is a boy wizard, quidditch player, best in sports, all secret greatness and unfair discrimination, at least, in the germ, at least until after the singularity.  
Just as we have been transhuman all the time that we have, in Bruno Latour's words, "never been modern," it can be and has repeatedly been claimed that everything will be perfect after the revolution. For Marx, this was the revolution he famously failed to locate rightly: not in his industrial England or even in his Germany but, however disastrously and unsustainably, where it did change the world, in Russia and (still ongoing) in a China that is today increasingly indistinguishable from a capitalist regime -- just ask the international financier Maurice Strong, or, for the same answer from a different source, ask Žižek. 



Apart from Marx, and closer to home, the “revolution” that was promised to change everything, at least when I was eleven going on twelve, was a socio-cultural, leftist revolution, that was the revolution of the 1968 generation as it played itself into nothing but the idols of the market, lots of music, drugs, distractions of sex and the need to announce one’s erotic orientation to the world. So we ask, which revolution?  Ah, the technological one, of course. And who announces this but those who market the same? The technological singularity turns out to be not unlike a Coke commercial.  

We are the world