The following is excerpted from Data Rules: Reinventing the Market Economy by Cristina Alaimo and Jannis Kallinikos. Reprinted with permission from The MIT Press. Copyright 2024.
The perception of data as technical items does not do justice to the semiotic, epistemic, and communication functions that data perform in economy and society. It is reasonable, though, to wonder to what degree these functions are shaped by the technological nature of digital data and by the formalized operations embodied in the software systems and devices through which data are produced and managed. This is, no doubt, an intricate question that echoes vexed issues of content versus form or medium, recurrent in the history of communication, in the arts, semiotics, and cognition more widely. The medium is never an innocent carrier of the content it conveys; rather, it is variously involved in shaping that content, a condition that has found its most memorable expression in Marshall McLuhan’s hyperbole that “the medium is the message.”
The facts that digital data mediate are undeniably shaped by the technical prescriptions by which they encode the life incidents they capture, the formal rules and principles of large data repositories (databases), the standards and protocols required for their transmission, the metrics that make specific data visible, and the hardwired functionalities of the software systems through which they are produced and shared. These operations, furthermore, are contingent on the inexorable logic of bitstrings and on how lower-level computing operations support higher-level, semantic functions. All these formal prerequisites trade content specificity or detail (context) for recognizability, retrieval, and other similar cognitive benefits. In addition, data are often generated on the assumption that they matter in conjunction with other tokens, at a large enough scale to disclose novel configurations of similarities and differences (usually expressed as metrics). As the practice of recombining and repurposing data across domains diffuses, the variety of circumstances that these massive data volumes bring about necessitates additional formalization and standardization. The growing institutional immersion of artificial intelligence (AI) applications and other emerging technologies further reinforces these trends.
All these conditions are undoubtedly responsible for…
See the entire article:
http://insideainews.com/2024/05/13/book-excerpt-data-technology-and-algorithms/