DeadGuyQuotes's Blog

American History in the Making

Sticking Your Head In The Sand

When reading The Access Principle by John Willinsky, I was particularly intrigued by the way he approached the topic of open access and politics. Essentially, the message is that more information and access to scholarly research and evidence can and should inform global, national, and local political policy debates. Ideally, members of the government, bureaucrats and politicians alike, should have access to the latest and best academic research. More importantly, members of a democratic society should have access to the same. Realized, this brave new world would be filled with informed, reasoned debate. Journalism would live up to its ideals, and mysticism, emotion, and rhetoric would give way to evidence and logic.

Sounds like the Reformation.

In fact, Willinsky references the impact of the printing press on the same event.

He bravely faces the critical issues surrounding this most noble ideal: context, and an informed public capable of reading the material. This is not to say that people aren’t intelligent enough, but there is a problem in at least American society today. Willinsky points to it when he quotes Christopher Forrest: “The public reads the bottom line.” I will tell you from personal experience that bureaucrats, politicians, soldiers, and any government support personnel also read “the bottom line.” Massive and complex issues are dealt with in one-page summaries. Detailed and sensitive issues are handled in boiled-down bullets. Willinsky espouses a fantastic ideal, but reality still presents a problem.

I have previously expressed concern about the information age: we have too much information and very few efficient and effective tools for culling through the mountains of data and conclusions. Opening all the doors to the ivory tower’s basement will only deepen the sense of information overload. As a collection of academics, citizens, and public servants, we must work harder on good knowledge management tools and principles if we are to see the future that Willinsky calls for.

Until then, I may just play the role of ostrich…


November 24, 2009 Posted by | Clio I - History and New Media | 4 Comments

Informatics Flambeau

∞ tsp of data
Countless hours of digitization
½ Supercomputer (or 4 cups Cloud Computing)
1 petabyte of storage
dash of creativity
75 gallons of coffee
1.75L Wild Turkey (101)
budget… lots of budget

Preheat the coffee pot.

Cull for hours identifying targets to digitally preserve.

Scan, photograph, capture, and torture original sources for digitally preserved replicas.

Switch from coffee to whiskey.

Realize you are in WAY over your head… run screaming to the hills and embrace your typewriter. Shimmy and shake, drink heavily, calm down and try again.

Pay someone to do something to get the project off the ground while wondering about the relevance of this to historical study.

Bake, survive a crash, learn about disaster recovery, recover, and present your treasure for the world.

Receive 25 hits on your site (4 from family, 10 from friends, 11 random accidents).

Set on fire.

Join a monastery, make beer, drink beer, and dream of life before electricity.

This seems to be the way to concoct a fine dish of informatics flambeau.

Our fine friends at Wikipedia offer the following somewhat verbose definition of Informatics: “Informatics is the science of information, the practice of information processing, and the engineering of information systems. Informatics studies the structure, algorithms, behavior, and interactions of natural and artificial systems that store, process, access and communicate information.”

Put differently, informatics is “a broad academic field encompassing artificial intelligence, cognitive science, computer science, information science, and social science.”

Informatics, knowledge management, Peter Norvig, Patrick Leary, and the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences all seem to be chasing the same notion: improve accessibility by connecting data and information with the right people. Web 2.0 is all about the data and about connecting people and communities to that data. This is truly a daunting task, one that is wrenching social scientists from their comfortable piles of moldy books and manuscripts and throwing them in front of bleeding-edge technologists. This is not a pleasant occurrence, as our class would most heartily attest.

Risks:

Leary, Rosenzweig, Cohen, and others have screamed about the perils on either side of the straight and narrow path. From data inundation, sloppy results from easy publishing, veracity issues, copyrights and wrongs, and cherry-picking from what is easily available, to missing the opportunities of chance (i.e., browsing the stacks and finding that needle in the haystack), there are very real and legitimate concerns.

Rewards:

The same authors and a host of other evangelists will proclaim the gospel of access and the troves of newly available data. This will only improve with time, they say. I tend to side with the evangelists… BUT I lean heavily on the requirement to make science and technology work for us.

Motivators:

Efficiency.

Present and future studies in history (and, I would argue, every field), much like modern production, will be driven by efficiency, accuracy, and continuous improvement to the processes of research and publication. Here is where Peter Norvig comes in. Complex computer engines will provide what he called lexical co-occurrences, illuminate the offline penumbra, and connect researchers with a larger community and its data. BUT beware, researcher: today this is as risky as Columbus setting out across the Atlantic looking for the Orient. Keep in mind, his mission, as ordered, was a complete disaster. The algorithms, programs, methods, and technology are all improving, but they aren’t there yet.
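To make “lexical co-occurrence” a bit more concrete, here is a minimal toy sketch in Python; it is my own illustration, not Norvig’s actual machinery, and the sample sentence is invented. It simply counts which words turn up within a few words of each other, which is the raw material such engines chew on.

```python
from collections import Counter

def cooccurrences(text, window=5):
    """Count how often pairs of words appear within `window` words of
    each other. A toy illustration of lexical co-occurrence counting."""
    words = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
    words = [w for w in words if w]
    counts = Counter()
    for i, word in enumerate(words):
        # Look only at the next few words; each nearby pair counts once.
        for neighbor in words[i + 1 : i + window]:
            counts[tuple(sorted((word, neighbor)))] += 1
    return counts

sample = ("Columbus set out across the Atlantic looking for the Orient, "
          "and the Atlantic crossing was, as ordered, a complete disaster.")
for pair, n in cooccurrences(sample).most_common(5):
    print(pair, n)
```

Real engines run this kind of counting over billions of words with far cleverer statistics, but the underlying idea is this humble tally.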

We are all cooking an informatics flambeau. The ingredients are volatile and the results are most definitely on fire. Historians cannot escape the drive to efficiency in research methods and output, but we cannot become experts either. Developing the technology required takes a lifetime of expertise and extremely detailed knowledge in quantum computing. The question is, how can we bridge the gap and become historians who affect the future of the tools we need and who influence the technology for our field?

The Clio I class is a great forum for the exploration of technology and makes a great proving ground for the tech-neophyte (a newb, not a n00b), but I am concerned that we are leaving some of the larger philosophical questions aside in our relative fear of technology. We have to understand the technology not to become developers, but to wield some of the tools and, more importantly, to communicate at some level with the expert technologists.

Just some thoughts… mostly barking at the wind.

— DGQ

November 9, 2009 Posted by | Clio I - History and New Media | 2 Comments

Digital Improvised Explosive Devices (DIED)

OK… the acronym was an absolute accident, but hey, I’m with the Government; I am a card-carrying official acronym producer. I guess it is natural… or a gift…

This week’s reading really obviates the need for my project in some ways and really opens the curtain on the real issues surrounding digital tool sets. At the root, I am working on a Text Encoding Initiative (TEI) project: basic text capture, presentation, preservation, and encoding, followed by some investigation into the power of metadata and the presentation of the text as data. But the problem is… and I suppose this is a legitimate concern across academia… why is my idea any better or different than anyone else’s?
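For a concrete sense of what “the text as data” can look like, here is a minimal sketch using Python’s standard library to assemble a TEI-style record. The element names follow the TEI P5 convention, but the title, publication note, and letter text are invented placeholders for illustration, not part of my actual project.

```python
import xml.etree.ElementTree as ET

# A minimal, illustrative TEI-style document: a header carrying metadata
# and a body carrying the captured text. All values are placeholders.
tei = ET.Element("TEI", xmlns="http://www.tei-c.org/ns/1.0")

header = ET.SubElement(tei, "teiHeader")
file_desc = ET.SubElement(header, "fileDesc")
title_stmt = ET.SubElement(file_desc, "titleStmt")
ET.SubElement(title_stmt, "title").text = "A Sample Letter, 1862"  # placeholder
pub_stmt = ET.SubElement(file_desc, "publicationStmt")
ET.SubElement(pub_stmt, "p").text = "Digitized for a low-budget class project."
source_desc = ET.SubElement(file_desc, "sourceDesc")
ET.SubElement(source_desc, "p").text = "Transcribed from the original manuscript."

text = ET.SubElement(tei, "text")
body = ET.SubElement(text, "body")
ET.SubElement(body, "p").text = "Dear Sister, the weather here is dreadful..."

ET.indent(tei)  # pretty-print; requires Python 3.9+
print(ET.tostring(tei, encoding="unicode"))
```

The point is simply that the header carries the metadata and the body carries the captured text, so both travel together as one machine-readable document that can later be searched, styled, or mined.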

Amidst the concerns of Rosenzweig’s excellent synopsis of the digital challenges and opportunities, how are professional historians supposed to move forward? I think the answer to both questions may be captured by Rosenzweig’s conclusion: “What is often said of military strategy seems to apply to digital preservation: ‘the greatest enemy of a good plan is the dream of a perfect plan.’ We have never preserved everything; we need to start preserving something.” As my efforts are targeted at low-budget, standards-based work, this falls into line with both the NINCH and Rosenzweig articles.

We must train ourselves in the basic standards of historical method using the new tools if we are to have any hope of effectively digging through the mountains of data emerging for historical analysis. Simultaneously, as the mountain of data grows, efforts must continue to ensure archivists and historians preserve the right documents and data. For historians studying governments, this can be a little easier, but it is still very challenging. NARA is one example of how little is actually being saved. Costs, legislation, and technology all affect how and what we save. But the historian wants the opportunity to look at it all.

The digital realm is covered in opportunities for success and in dangerous mines ready to blow up the unsuspecting historian. These issues include technology, ownership, distribution, accuracy, preservation, and cost, as well as myriad other dangers. Now is the time for these issues to be solved. Rosenzweig points out that schools have to train their graduate students to grapple with the issues and even master them. George Mason University’s attempts at digital history are a great start, but they leave many specific and highly particular issues unaddressed.

To paraphrase Rosenzweig, we have to start something digital.

November 3, 2009 Posted by | Clio I - History and New Media | Leave a comment