∞ tsp of data
Countless hours of digitization
½ Supercomputer (or 4 cups Cloud Computing)
1 petabyte of storage
dash of creativity
75 gallons of coffee
1.75L Wild Turkey (101)
budget… lots of budget
Preheat the coffee pot.
Cull for hours identifying targets to digitally preserve.
Scan, photograph, capture, and torture original sources for digitally preserved replicas.
Switch from coffee to whiskey.
Realize you are in WAY over your head… run screaming to the hills and embrace your typewriter. Shimmy and shake, drink heavily, calm down and try again.
Pay someone to do something to get the project off the ground while wondering about the relevance of this to historical study.
Bake, survive a crash, learn about disaster recovery, recover, and present your treasure for the world.
Receive 25 hits on your site (4 from family, 10 from friends, 11 random accidents).
Set on fire.
Join a monastery, make beer, drink beer, and dream of life before electricity.
This seems to be the way to concoct a fine dish of informatics flambeau.
Our fine friends at Wikipedia offer the following somewhat verbose definition of Informatics: “Informatics is the science of information, the practice of information processing, and the engineering of information systems. Informatics studies the structure, algorithms, behavior, and interactions of natural and artificial systems that store, process, access and communicate information.”
Put differently, informatics is “a broad academic field encompassing artificial intelligence, cognitive science, computer science, information science, and social science.”
Informatics, knowledge management, Peter Norvig, Patrick Leary, and The American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences all seem to be chasing the same notion: Improve accessibility by connecting data and information with the right people. Web 2.0 is all about the data and connecting people and communities to that data. This is truly a daunting task that is wrenching social scientists from their comfortable piles of moldy books and manuscripts and throwing them in front of bleeding edge technologists. This is not a pleasant occurrence, as our class would most heartily attest.
Leary, Rosenzweig, Cohen, and others have screamed about the perils on either side of the straight and narrow path. From data inundation, sloppy results from easy publishing, veracity issues, and copyrights and wrongs, to cherry-picking from what is easily available and missing the opportunities of chance (i.e., browsing the stacks and finding that needle in the haystack), there are very real and legitimate concerns.
The same authors and a host of other evangelicals will proclaim the gospel of access and the troves of newly available data. This will only improve with time, they say. I tend to side with the evangelicals… BUT I lean heavily on the requirement to make science/technology work for us.
Present and future studies in history (and, I would argue, every field), much like modern production, will be driven by efficiencies, accuracy, and continuous improvement to the processes of research and publication. Here is where Peter Norvig comes in. Complex computer engines will provide what he called lexical co-occurrences, enlighten the offline penumbra, and connect researchers with a larger community and its data. BUT beware, researcher: today this is as risky as Columbus setting out across the Atlantic in search of the Orient. Keep in mind, his mission, as ordered, was a complete disaster. The algorithms, programs, methods, and technology are all improving, but they aren't there yet.
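For the curious non-technologist: Norvig's "lexical co-occurrences" are, at bottom, counts of which words appear near which other words across a corpus. A toy sketch in Python (my own illustration of the idea, not Norvig's actual machinery, which is vastly more sophisticated):

```python
from collections import Counter

def cooccurrences(text, window=4):
    """Count how often each word pair appears within `window` words
    of each other in a text. A toy illustration of co-occurrence
    counting; real engines run this over billions of documents."""
    words = text.lower().split()
    pairs = Counter()
    for i, word in enumerate(words):
        # Look only at the next few words; sorting the pair makes
        # (a, b) and (b, a) count as the same co-occurrence.
        for neighbor in words[i + 1 : i + window]:
            pairs[tuple(sorted((word, neighbor)))] += 1
    return pairs

counts = cooccurrences("the archive holds the letters the archive preserves")
print(counts.most_common(3))
```

Scale that loop from one sentence to a petabyte of digitized sources and you begin to see both the promise and the engineering problem.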
We are all cooking an informatics flambeau. The ingredients are volatile and the results are most definitely on fire. Historians cannot escape the drive to efficiency in research methods and output, but we cannot become experts either. Developing the technology required takes a lifetime of expertise and extremely detailed knowledge in quantum computing. The question is, how can we bridge the gap and become historians who affect the future of the tools we need and who influence the technology for our field?
The Clio I class is a great forum for the exploration of technology and makes a great proving ground for the tech-neophyte (a newb, not a n00b), but I am concerned that we are leaving some of the larger philosophical questions aside in our relative fear of technology. We have to understand the technology not to become developers, but to wield some of the tools and, more importantly, to communicate at some level with the expert technologists.
Just some thoughts… mostly barking at the wind.