Imagined Worlds

Picked up a book by Freeman Dyson (of Project Orion fame - the original "get to space by sitting on top of a nuke" plan). I've read things by and about Dyson and his fairly eminent children before (The Starship and the Canoe, for example); it's a unique family, to say the least!

This book - Imagined Worlds - is interesting, but not life-altering. It's scattered at best; adapted from a series of lectures, there isn't a coherent thread or direction in the book. And while each chapter asks intriguing questions, even on a micro-scale it doesn't come across as much more than rambling around a simple point.

The quick summary of the book:
  • Science has had and will continue to have a profound effect on human development.
  • People tend not to learn from past mistakes.
  • Technology evolves through trial and error; if you take away competition and/or the opportunity to try and fail, all you get are more colossal failures later on. So the most successful technologies - long term - will be those that struggled against competition - usually grass-roots efforts rather than big-budget items protected from short-term failure by governments/corporations.
  • Things will be different in the future, and more different the farther into the future you go.
  • Radio-based telepathy and group-consciousness might be really cool.
  • Computer-based design is cool.
  • Biotech is probably the Next Big Thing.
  • War is bad.

If you infer from that list that I thought the book had one useful learning and a lot of fluff, you'd be right.

The book didn't live up to my expectations - my fault, not the author's. Now a book that attempted to be a closure over those key ideas would be interesting. As it stands, discussion of how life might be a little culturally different in a hundred or a thousand years comes across as naive when juxtaposed with a discussion of how people might be able to custom-design their bioengineered pets or children on a personal computer in the almost immediate future.

At least Dyson points out that even if some people don't use a technology like bioengineering, some probably will. Where he doesn't go is that the groups that do use new technologies will change and adapt so much more quickly than those who don't that the abstainers will be marginalized at an ever-increasing rate, to the point where they simply don't amount to much in the vast scheme of humanity.

I will say that the introduction will stick with me for a long time - it points to H. G. Wells' The Time Machine as a look at an attempt to "stabilize" humanity. There are three attractors for the chaotic development of humanity over time: extinction, stability, and transhumanity. People tend to disbelieve in extinction (too afraid) and transhumanity (too hard to imagine), but stability is by far the most terrifying to me. Now I have a well-known reference point to use in discussions.
