Approaching Ethical Dilemmas in the Digital Public Sphere

How should we handle crowdsourced knowledge that may be inaccurate or misrepresentative? What do we do when technology can fabricate sources that don’t actually exist?

In 2006, historian Roy Rosenzweig wrote cautiously, but with some optimism, that history could be crowdsourced through sites like Wikipedia. He noted that one of Wikipedia's most exciting features is its Creative Commons licensing, which means that anyone can republish its content; Wikipedia therefore amplifies historical narratives, and its information can be found again and again across digital spaces. Rosenzweig also found Wikipedia's factual information highly reliable. Beyond that, however, Wikipedia is shaped by biases. It is Western-centered, and most of its editors come from English-speaking countries. Its volunteer editors are often amateur experts on particular topics, knowledgeable about facts but not about context and historiography.[1]

When Wikipedia is inaccurate, it tends to be corrected quickly. But when it is misrepresentative, built on incorrect or incomplete facts and sources, its amplification of particular narratives can be problematic, even dangerous.

A 2021 Wired article chronicled Ksenia Coffman's efforts to edit Wikipedia's pages on the Nazi SS, correcting what "seemed like a concerted effort to look the other way about Germany's wartime atrocities." Pages on the SS included photos that looked like glamour shots. False stories about officers trying to help Jews were attributed to a single source and taken out of context. A page on an SS tank division was based almost wholly on unreliable sources. Coffman's edits became the focus of controversy as the gate-keepers (the original editors of these pages) sought to present what they regarded as neutral facts and soldiers' stories, but they did so as if those men "existed outside the genocidal Nazi cause." Coffman ultimately began to use Wikipedia itself to discredit the sources on which the SS pages were based. Once those sources were marked as biased, she could remove the claims praising Nazis. Her work is praiseworthy, but how many of us have the time and energy (her editing grew to 5,000 edits per month) to devote to policing Wikipedia entries?

I have done minimal Wikipedia editing. University of the Pacific's library and history faculty and students held a Wikipedia edit-a-thon right before Covid hit to highlight items in our special collections. That was fun, and we didn't run into recalcitrant editors. My experiences working in my own field have been tougher, and I just can't commit the time and energy to these battles. The Camp Fire page (the organization was formerly Camp Fire Girls) is on the whole very accurate, but it calls the organization co-educational. I tried to change that to "all-gender organization." This is a neutral point; it is in fact how the organization's current leadership refers to it. My edits were reverted within hours. I picture an overzealous alumna guarding the gender gates, but of course I have no idea who patrols my edits! After two weeks I resigned myself to including the information in my book on Camp Fire, which can then be added to the page as a reference. Creating robust references is one way we can provide the sources for people to fact- and bias-check for themselves.

A separate issue is that our technologies allow us to fabricate sources that don't actually exist. No perfect solution exists to combat fakes, but archivists and historians have developed tools to ferret out forgeries, to teach the importance of provenance and corroboration, and to question the reliability of sources. Our most likely solutions will be found in creating and maintaining professional standards for born-digital objects: standards that tag questionable materials, allow researchers to corroborate sources, and teach how to read them critically. Nicholas Taylor, former program manager of the LOCKSS Program at Stanford, says he "endorse[s] a particularly paranoid approach to digital preservation." LOCKSS stands for "lots of copies keep stuff safe"; its open-source software enables distributed preservation across libraries and academic institutions to help ensure the permanence of digital materials.[2] This seems to be good policy. If archivists keep lots of copies, then future scholars can decipher fakes, and they can even study the current problem of disinformation itself. In addition, those of us who teach history to general education and history students must diligently teach how to critically assess digital materials and how to find documents that corroborate the digital sources they are finding.
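The logic behind "lots of copies keep stuff safe" can be made concrete with a small illustration. The sketch below is not the LOCKSS software itself; it is a minimal, hypothetical example (the repository names and documents are invented) of how holding independent copies lets us corroborate a digital object: if several institutions' copies produce the same cryptographic fingerprint and one does not, the outlier is flagged as possibly altered or corrupted.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest (fingerprint) of a byte string."""
    return hashlib.sha256(data).hexdigest()

def corroborate(copies: dict[str, bytes]) -> dict[str, list[str]]:
    """Group holding institutions by the fingerprint of their copy.
    A fingerprint held by only one institution flags a copy that
    disagrees with the others."""
    groups: dict[str, list[str]] = {}
    for holder, data in copies.items():
        groups.setdefault(digest(data), []).append(holder)
    return groups

# Three hypothetical repositories hold the "same" letter;
# one copy has been silently altered.
copies = {
    "library_a": b"Dear editor, I write to confirm the events of 1923...",
    "library_b": b"Dear editor, I write to confirm the events of 1923...",
    "archive_c": b"Dear editor, I write to deny the events of 1923...",
}

groups = corroborate(copies)
for fingerprint, holders in groups.items():
    status = "majority" if len(holders) > 1 else "OUTLIER"
    print(f"{fingerprint[:12]}... held by {holders} [{status}]")
```

The same comparison generalizes beyond three copies: the more independent institutions that hold a matching copy, the harder it becomes for a fabricated version to pass as authentic.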

[1] Roy Rosenzweig, "Can History Be Open Source? Wikipedia and the Future of the Past," The Journal of American History 93, no. 1 (June 2006): 117-46.

[2] Melanie Ehrenkranz, "How Archivists Could Stop Deepfakes from Rewriting History," Gizmodo, October 16, 2018.
