The other day, I was reading “G. Goldkuhl (2013) The IT Artefact: An ensemble of the social and the technical? – A rejoinder”. Here’s what I’ve learned:
Goldkuhl points out the non-physical effects that physical artefacts might have:
“Even if artefacts in these theories [of intentionally creating things] are considered as physical entities, it is important to note that their functions are not restricted to material influence. Other functions as social and aesthetic functions are also acknowledged”
Moreover, artefacts of a non-physical nature might exist:
“Even if an artefact does not need to have a physical existence [according to Lee], it needs to have some separate and enduring existence and it should be brought into existence as a result of some intentional making of humans.”
From here on, he focuses on physical artefacts with social influence, “social artefacts” [K], by considering “computational artefacts” [T] in human contexts.
“When looking at ovens, it is their capacity to produce heat that is the essential function. When looking at IT artefacts, the most important trait is their capacity to mediate communication between people
However, we do not need to put humans inside the boundary of the IT artefact in order to make these artefacts social.”
This, of course, completely ignores the existence of computational artefacts (CAs) in purely technical contexts, usually referred to as embedded systems. However, the underlying idea in both social and technical contexts is much the same: extract the information-processing part and embed it as a separate artefact (the embedded system) in the overall system.
Let us stick with this “embedded” aspect here, as we see it as the main point of the work. We have computational artefacts embedded in a technical and/or social environment. Take, for instance, a vending machine that has a unit triggering the dispenser and that can process user input from the keypad. So far, it is the same idea, but how does it stand up to a closer look?
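The vending machine’s CA can be pictured as a small translator between its two interfaces: keypad presses coming in from the human side, dispenser commands going out on the machine side. A minimal sketch, where the key protocol (`#` confirms, `*` cancels) and all names are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of a vending machine's computational artefact (CA):
# it processes keypad input (human interface) and emits dispenser
# commands (machine interface). The protocol here is made up.

def run_vending_ca(keys: str, stock: dict) -> list:
    """Translate a sequence of keypad presses into dispenser commands."""
    commands = []
    selection = ""
    for key in keys:
        if key.isdigit():
            selection += key          # accumulate the slot number
        elif key == "#":              # '#' confirms the selection
            if stock.get(selection, 0) > 0:
                stock[selection] -= 1
                commands.append(f"DISPENSE slot {selection}")
            else:
                commands.append(f"SHOW 'slot {selection} empty'")
            selection = ""
        elif key == "*":              # '*' cancels the current input
            selection = ""
    return commands
```

For example, `run_vending_ca("12#", {"12": 1})` yields `["DISPENSE slot 12"]`. The point is that the information-processing core is the same whether its output drives a motor or a display for a human; the differences lie at the interfaces, as the following points show.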
- Obviously, interfaces to humans and machines are considerably different. Where the CA-to-machine interface comes down to electrical impulses, human–CA interaction still requires clumsy devices like keypads, mice, screens etc. But notice that this is a physical aspect of the CA, not an informational one.
- Machines (only) do as they’re told. Barring mechanical defects, machines execute the orders the CA gives them. However, since there is no such thing as a pop-up box asking “Are you really sure? – Yes, No, Abort”, every circumstance must be considered beforehand.
- Humans have common sense (and other problems). Apart from misunderstandings caused by the user interface (i.e., when physical difficulties affect the information processing), unexpected effects can stem from the nature of a human as an information-processing social being.
On the upside, however, filtering a CA’s output through human common sense might save us from follies like starting another world war (false alerts of exactly this kind have actually occurred). [S]
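The second bullet’s point — that on the machine side every circumstance must be anticipated in advance — can be sketched as follows. There is no dialog to fall back on, so the CA must enumerate each fault condition itself before issuing the command; all function and parameter names here are hypothetical:

```python
# Illustrative only: the dispenser has no common sense and no
# "Are you sure?" dialog, so every precondition the CA does not
# check beforehand becomes an order executed regardless of consequences.

def drive_dispenser(slot: str, motor_ok: bool, door_closed: bool) -> str:
    """Issue a dispense command only after checking each anticipated fault."""
    if not motor_ok:
        return "FAULT: motor failure"
    if not door_closed:
        return "FAULT: service door open"
    return f"OK: rotated motor for slot {slot}"
```

A human operator noticing smoke would simply stop; the machine-side CA stops only for the conditions someone thought to write down.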
Altogether, we see that CAs in social and technical contexts differ not only in terms of their interfaces:
“It is important to see that social structure (to use this term from Orlikowski & Iacono, 2001) is inscribed into the IT artefact.”
Summing up, the term “embedded system” applies to CAs inside technical as well as social systems. However, a CA in a human context is not just a CA from a machine context with a cuter interface.
Practitioner’s takeaway: this social/technical distinction often says more about the character of a software project than its line of business or technology does, which makes it a helpful clarification in any project abstract or summary.
[K] P. Kroes (2012) “Technical Artefacts: Creations of Mind and Matter”
Series: Philosophy of Engineering and Technology (6), Springer Dordrecht
[T] As discussed in:
R. Turner (2018) “Computational Artifacts (Theory and Applications of Computability)”
Springer Berlin Heidelberg
[S] A great read on this: B.C. Smith (1985) “The Limits of Correctness”