From Wikipedia
A computer generally doesn't care whether the answer to a problem is "5" or "25", "yes" or "no", but humans involved in a group decision-making process may care very much about the outcome, and the various answers can have winners and losers with potentially very high stakes.
At a minimum, this introduces substantial bias into the analysis of any data, as people will tend to selectively see facts that support the conclusions they personally prefer.
In a collaboration within a single hospital between multiple clinicians, mediated by an electronic medical record, there may actually be a substantial amount of dispute and negotiation going on among, say, a group focusing on treating diabetes and another group focusing on treating congestive heart failure.
The "collaboration" may actually be much more of a competition to frame and define the problem in terms that result in favorable outcomes.
Again, this makes CSC design work far more complicated than simply trying to get a group of sensors or computers to share data and work together correctly.
In fact, in some cases, participants may have a strong vested interest in the status quo and prefer as an outcome that the "problem" not be solved.
A successful CSC system, in their minds, would be one that prevented the solution of the problem supposedly being addressed collaboratively, perhaps while giving a misleading appearance of cooperative effort.
This factor complicates research into whether a CSC system is well-designed or not.