20 found
  1. Algorithmic Decision-Making Based on Machine Learning from Big Data: Can Transparency Restore Accountability? Paul B. de Laat - 2018 - Philosophy and Technology 31 (4):525-541.
    Decision-making assisted by algorithms developed by machine learning is increasingly determining our lives. Unfortunately, full opacity about the process is the norm. Would transparency contribute to restoring accountability for such systems as is often maintained? Several objections to full transparency are examined: the loss of privacy when datasets become public, the perverse effects of disclosure of the very algorithms themselves, the potential loss of companies’ competitive edge, and the limited gains in answerability to be expected since sophisticated algorithms usually are (...)
    31 citations
  2. Companies Committed to Responsible AI: From Principles towards Implementation and Regulation? Paul B. de Laat - 2021 - Philosophy and Technology 34 (4):1135-1193.
    The term ‘responsible AI’ has been coined to denote AI that is fair and non-biased, transparent and explainable, secure and safe, privacy-proof, accountable, and to the benefit of mankind. Since 2016, a great many organizations have pledged allegiance to such principles. Amongst them are 24 AI companies that did so by posting a commitment of the kind on their website and/or by joining the ‘Partnership on AI’. By means of a comprehensive web search, two questions are addressed by this study: (...)
    7 citations
  3. The disciplinary power of predictive algorithms: a Foucauldian perspective. Paul B. de Laat - 2019 - Ethics and Information Technology 21 (4):319-329.
    Big Data are increasingly used in machine learning in order to create predictive models. How are predictive practices that use such models to be situated? In the field of surveillance studies many of its practitioners assert that “governance by discipline” has given way to “governance by risk”. The individual is dissolved into his/her constituent data and no longer addressed. I argue that, on the contrary, in most of the contexts where predictive modelling is used, it constitutes Foucauldian discipline. Compliance to (...)
    6 citations
  4. Trusting virtual trust. Paul B. de Laat - 2005 - Ethics and Information Technology 7 (3):167-180.
    Can trust evolve on the Internet between virtual strangers? Recently, Pettit answered this question in the negative. Focusing on trust in the sense of ‘dynamic, interactive, and trusting’ reliance on other people, he distinguishes between two forms of trust: primary trust rests on the belief that the other is trustworthy, while the more subtle secondary kind of trust is premised on the belief that the other cherishes one’s esteem, and will, therefore, reply to an act of trust in kind (‘trust-responsiveness’). (...)
    11 citations
  5. Open Source Production of Encyclopedias: Editorial Policies at the Intersection of Organizational and Epistemological Trust. Paul B. de Laat - 2012 - Social Epistemology 26 (1):71-103.
    The ideas behind open source software are currently applied to the production of encyclopedias. A sample of six English text-based, neutral-point-of-view, online encyclopedias of the kind are identified: h2g2, Wikipedia, Scholarpedia, Encyclopedia of Earth, Citizendium and Knol. How do these projects deal with the problem of trusting their participants to behave as competent and loyal encyclopedists? Editorial policies for soliciting and processing content are shown to range from high discretion to low discretion; that is, from granting unlimited trust to limited (...)
    7 citations
  6. How can contributors to open-source communities be trusted? On the assumption, inference, and substitution of trust. Paul B. de Laat - 2010 - Ethics and Information Technology 12 (4):327-341.
    Open-source communities that focus on content rely squarely on the contributions of invisible strangers in cyberspace. How do such communities handle the problem of trusting that strangers have good intentions and adequate competence? This question is explored in relation to communities in which such trust is a vital issue: peer production of software (FreeBSD and Mozilla in particular) and encyclopaedia entries (Wikipedia in particular). In the context of open-source software, it is argued that trust was inferred from an underlying ‘hacker (...)
    11 citations
  7. Algorithmic decision-making employing profiling: will trade secrecy protection render the right to explanation toothless? Paul B. de Laat - 2022 - Ethics and Information Technology 24 (2).
    Algorithmic decision-making based on profiling may significantly affect people’s destinies. As a rule, however, explanations for such decisions are lacking. What are the chances for a “right to explanation” to be realized soon? After an exploration of the regulatory efforts that are currently pushing for such a right it is concluded that, at the moment, the GDPR stands out as the main force to be reckoned with. In cases of profiling, data subjects are granted the right to receive meaningful information (...)
    1 citation
  8. From open-source software to Wikipedia: ‘Backgrounding’ trust by collective monitoring and reputation tracking. Paul B. de Laat - 2014 - Ethics and Information Technology 16 (2):157-169.
    Open-content communities that focus on co-creation without requirements for entry have to face the issue of institutional trust in contributors. This research investigates the various ways in which these communities manage this issue. It is shown that communities of open-source software continue to rely mainly on hierarchy (reserving write-access for higher echelons), which substitutes (the need for) trust. Encyclopedic communities, though, largely avoid this solution. In the particular case of Wikipedia, which is confronted with persistent vandalism, another arrangement has been pioneered instead. (...)
    3 citations
  9. Coercion or empowerment? Moderation of content in Wikipedia as 'essentially contested' bureaucratic rules. Paul B. de Laat - 2012 - Ethics and Information Technology 14 (2):123-135.
    In communities of user-generated content, systems for the management of content and/or their contributors are usually accepted without much protest. Not so, however, in the case of Wikipedia, in which the proposal to introduce a system of review for new edits (in order to counter vandalism) led to heated discussions. This debate is analysed, and arguments of both supporters and opponents (of English, German and French tongue) are extracted from Wikipedian archives. In order to better understand this division of the (...)
    5 citations
  10. Navigating between chaos and bureaucracy: Backgrounding trust in open-content communities. Paul B. de Laat - 2012 - In Karl Aberer, Andreas Flache, Wander Jager, Ling Liu, Jie Tang & Christophe Guéret (eds.), 4th International Conference, SocInfo 2012, Lausanne, Switzerland, December 5-7, 2012. Proceedings. Springer.
    Many virtual communities that rely on user-generated content (such as social news sites, citizen journals, and encyclopedias in particular) offer unrestricted and immediate ‘write access’ to every contributor. It is argued that these communities do not just assume that the trust granted by that policy is well-placed; they have developed extensive mechanisms that underpin the trust involved (‘backgrounding’). These target contributors (stipulating legal terms of use and developing etiquette, both underscored by sanctions) as well as the contents contributed by them (...)
    3 citations
  11. Open Source Software: A New Mertonian Ethos? Paul B. de Laat - 2001 - In Anton Vedder (ed.), Ethics and the Internet. Intersentia.
    Hacker communities of the 1970s and 1980s developed a quite characteristic work ethos. Its norms are explored and shown to be quite similar to those which Robert Merton suggested govern academic life: communism, universalism, disinterestedness, and organized scepticism. In the 1990s the Internet multiplied the scale of these communities, allowing them to create successful software programs like Linux and Apache. After renaming themselves the `open source software' movement, with an emphasis on software quality, they succeeded in gaining corporate interest. As (...)
    3 citations
  12. Copyright or copyleft?: An analysis of property regimes for software development. Paul B. de Laat - 2005 - Research Policy 34 (10):1511-1532.
    Two property regimes for software development may be distinguished. Within corporations, on the one hand, a Private Regime obtains which excludes all outsiders from access to a firm's software assets. It is shown how the protective instruments of secrecy and both copyright and patent have been strengthened considerably during the last two decades. On the other, a Public Regime among hackers may be distinguished, initiated by individuals, organizations or firms, in which source code is freely exchanged. It is argued that (...)
    2 citations
  13. The use of software tools and autonomous bots against vandalism: eroding Wikipedia’s moral order? Paul B. de Laat - 2015 - Ethics and Information Technology 17 (3):175-188.
    English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the ‘coactivity’ in use between humans and bots, this research ‘discloses’ the moral issues that emerge from the combined patrolling by humans and bots. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical (...)
    1 citation
  14. Big data and algorithmic decision-making. Paul B. de Laat - 2017 - ACM SIGCAS Computers and Society 47 (3):39-53.
    Decision-making assisted by algorithms developed by machine learning is increasingly determining our lives. Unfortunately, full opacity about the process is the norm. Can transparency contribute to restoring accountability for such systems? Several objections are examined: the loss of privacy when data sets become public, the perverse effects of disclosure of the very algorithms themselves, the potential loss of competitive edge, and the limited gains in answerability to be expected since sophisticated algorithms are inherently opaque. It is concluded that transparency is (...)
    1 citation
  15. Profiling vandalism in Wikipedia: A Schauerian approach to justification. Paul B. de Laat - 2016 - Ethics and Information Technology 18 (2):131-148.
    In order to fight massive vandalism the English-language Wikipedia has developed a system of surveillance which is carried out by humans and bots, supported by various tools. Central to the selection of edits for inspection is the process of using filters or profiles. Can this profiling be justified? On the basis of a careful reading of Frederick Schauer’s books about rules in general (1991) and profiling in particular (2003) I arrive at several conclusions. The effectiveness, efficiency, and risk-aversion of (...)
  16. Internet-Based Commons of Intellectual Resources: An Exploration of their Variety. Paul B. de Laat - 2006 - In Jacques Berleur, Markku I. Nurminen & John Impagliazzo (eds.), IFIP; Social Informatics: An Information Society for All? In Remembrance of Rob Kling Vol 223. Springer.
    During the two last decades, speeded up by the development of the Internet, several types of commons have been opened up for intellectual resources. In this article their variety is being explored as to the kind of resources and the type of regulation involved. The open source software movement initiated the phenomenon, by creating a copyright-based commons of source code that can be labelled `dynamic': allowing both use and modification of resources. Additionally, such a commons may be either protected from (...)
  17. Trusting the (ro)botic other: By assumption? Paul B. de Laat - 2015 - SIGCAS Computers and Society 45 (3):255-260.
    How may human agents come to trust (sophisticated) artificial agents? At present, since the trust involved is non-normative, this would seem to be a slow process, depending on the outcomes of the transactions. Some more options may soon become available though. As debated in the literature, humans may meet (ro)bots as they are embedded in an institution. If they happen to trust the institution, they will also trust them to have tried out and tested the machines in their back corridors; (...)
  18. Emerging roles for third parties in cyberspace. Paul B. de Laat - 2001 - Ethics and Information Technology 3 (4):267-276.
    In `real' space, third parties have always been useful to facilitate transactions. With cyberspace opening up, it is to be expected that intermediation will also develop in a virtual fashion. The article focuses upon new cyberroles for third parties that seem to announce themselves clearly. First, virtualization of the market place has paved the way for `cybermediaries', who broker between supply and demand of material and informational goods. Secondly, cybercommunication has created new uncertainties concerning informational security and privacy. Also, as in real space, transacting supposes some decency with one's partners. These needs are being addressed (...)
  19. Trusting the (ro)botic other. Paul B. de Laat - 2015 - ACM SIGCAS Computers and Society 45 (3):255-260.
    How may human agents come to trust artificial agents? At present, since the trust involved is non-normative, this would seem to be a slow process, depending on the outcomes of the transactions. Some more options may soon become available though. As debated in the literature, humans may meet bots as they are embedded in an institution. If they happen to trust the institution, they will also trust them to have tried out and tested the machines in their back corridors; as (...)
  20. Online diaries: Reflections on trust, privacy, and exhibitionism. [REVIEW] Paul B. de Laat - 2008 - Ethics and Information Technology 10 (1):57-69.
    Trust between transaction partners in cyberspace has come to be considered a distinct possibility. In this article the focus is on the conditions for its creation by way of assuming, not inferring trust. After a survey of its development over the years (in the writings of authors like Luhmann, Baier, Gambetta, and Pettit), this mechanism of trust is explored in a study of personal journal blogs. After a brief presentation of some technicalities of blogging and authors’ motives for writing their (...)
    10 citations