Principles of technology assessment platforms

Technology assessment platforms are built on three sets of principles: the precautionary principle (care), the right to know (transparency) and the right to say ‘no’ (consent), together with the principles of participatory action research.

1) Care: the principle of precaution

Precaution simply means being careful and taking care. Today, powerful interest groups promoting new industrial technologies misleadingly suggest that an attitude of care and precaution is somehow unscientific or anti-technology – that, by enacting bans, it will arrest the entrepreneurial risk-taking modern economies need. But the precautionary principle, whose logic lies behind the process of technology assessment, does not necessarily mean banning new technologies or stopping things from being researched and understood. Enshrined in the legal systems of numerous countries, including those of the European Union, it simply urges that time and space be found to get things right using an appropriate range of perspectives. Where doubt remains, applying the precautionary principle tends towards policies that prevent the release of a potentially harmful technology.

To see the value of this, we can start by considering history. Take, for example, asbestos, lead, benzene, pesticides, ozone-depleting chemicals or industrial-scale fishing. In all these areas of technology (and many more), early precautionary action was dismissed as irrational by governments, business and scientific establishments alike, all fearful of losing advantage in an industrial race. They claimed there were no alternatives and that advocates of precaution were anti-science or opposed to everyone’s need to benefit from Western development. Yet it is now agreed on all sides of the debate that levels of risk were initially drastically understated and that there were more alternative pathways than the proponents of these technologies acknowledged at the time.

Applying the precautionary principle is also a means of reminding those involved with science and technology that innovation and progress are not a one-way race to the future. Instead, technological developments can take many paths. Though often concealed behind the banner of ‘science’, the development of each technology involves intrinsically political choices. Assessing the risk arising from a new technology requires an understanding of the nature of uncertainty. The precautionary principle states that where there is some evidence of potential threats to human health or the environment, scientific uncertainty about how strong that evidence is should not be a reason for those in power to avoid their responsibility to take action to avoid harm. Uncertainty does not compel a particular action; it merely reminds us that lack of evidence of harm is not the same thing as evidence of lack of harm. In other words, the crux of precaution lies in taking as much care to avoid the scientific error of mistakenly assuming safety as we take to avoid mistakenly assuming harm.

When it comes to uncertainty, it is not merely that calculating some single definitive “sound scientific” or “evidence-based” solution is difficult in practice; these terms are often used in political ways to head off questions from people with a wider range of perspectives than the narrow mindset of scientism allows. Good governance of science and technology means listening to questions from a range of expert perspectives, including those derived from lived experience, rather than relying solely on those of professional experts. Uncertainties are among the most important driving forces in science. When scientific institutions come into contact with economic and political power, however, they find themselves becoming complicit in a strange kind of uncertainty denial.

2) Transparency: the right to know

In our increasingly technologically complex world there is a danger that almost all decisions of importance will be made behind closed doors. Experts connected to powerful organisations will make momentous choices on the justification that such judgements require specialised knowledge, while the rest of us are excluded from influencing those decisions. The idea that the powerful technologies or science-based decisions that shape our lives are too complicated to involve the views of people outside a small group of experts can lead to power being handed to an ever-smaller number of bureaucrats. Democracy, defined as rule by the many, thus gives way to technocracy – rule by experts.

Regaining control over decisions that involve technologies requires helping a wider range of people to better understand the scientific and technological issues around them, or those set to impact their rights and living conditions. This includes making the presence of particular technologies visible (for example through labelling and awareness-raising), creating easy-to-understand public information about their uses and abuses, sharing what is known about risks and uncertainties, and acknowledging the limits of knowledge about a technological development or scientific matter. This counters the myth that scientific experts alone know best. In this way everybody can bring their own knowledge, values and wisdom to bear on how a technology is assessed and ultimately governed for the common good.

3) Consent: the right to say “no”

The right to say “no” to a new technology, or to set the conditions for saying “yes”, has been enshrined in several key United Nations conventions as a process of “Free, Prior and Informed Consent” (FPIC – see box). These four words describe the collective right people should have, first, to understand all relevant dimensions of scientific research and technological development and, second, to say no to either the research or the technological development if they believe there could be negative impacts.

Free, Prior and Informed Consent

Free Prior and Informed Consent (FPIC) is not the same thing as ‘consultation’ or ‘listening’ or even ‘dialogue’. Any group committing to FPIC must take on the responsibility to implement each of its four elements:

FREE: Communities must be empowered not only to participate in the processes without coercion, but to be at the centre of defining the processes themselves. That is why non-corporate and non-state actors must play an active role in supporting community participation and facilitating the deliberative processes through which people can actively participate. These processes must be self-governed and independent of those developing the technologies. A process cannot be Free, for example, if it is driven by an industry invested in deploying a new technology and that industry picks its own stakeholders.

PRIOR: All assessment processes should precede research, development or deployment, and must become ongoing. That is, proposed technologies must be subject to a process of community assessment at every stage. If discussions come too late, bad outcomes may already be “locked in”. Research into, development of or deployment of a potential future technology cannot move forward simply because an organisation approved it at some point in the past. Knowledge changes over time, and each time new information comes to light the participatory process should be updated. Technology assessment should not be a once-and-forever “yes or no” exercise; it is a process of considering the economic, cultural, social and political implications of a technology on an ongoing basis.

INFORMED: All technology assessment must begin with a process of participatory research and inquiry into the potential impacts of a technology or group of technologies. This inquiry should not consider just the use of a technology for its stated purpose; it should also consider other possible uses and the underlying economic and political interests behind the technology’s development and deployment. Civil society, non-governmental organisations, trade unions, peasant groups, indigenous peoples and local communities are all critical to ensuring a robust and truly informed understanding of any developments in science and technology. The assessment process should also be informed by the diverse knowledge systems of those who are potentially impacted by a technology. This means that assessment cannot depend on a single narrow knowledge system, such as the viewpoint of Western scientists, but must embrace, for example, the traditional ecological knowledge of the communities that are potentially impacted. Often, technologies are discussed only in terms of a single stated or proposed use, with little examination of how the technology may play out in the real world or of the externalised costs and consequences for communities. The questions asked about new technologies must go beyond blunt ones such as “does it work?” or “is it good or bad?”(1). A truly informed process will also deeply consider the uncertainties, including potential consequences that may emerge over time.

CONSENT: Consent to the development of a new technology can only be freely given if it can be taken away at any time. Consent is a process rather than a one-off event: it can be withdrawn or reconsidered as new information emerges and new experiences add to our understanding of a technology. The process of technology assessment must begin with a default assumption of “no” until consent is clearly given, if it is given at all. Consent, in the context of community rights, must also be democratically inclusive, considering the interests of all and taking particular care that those most impacted by a technology, especially the vulnerable, are able to give or withhold consent. To be legitimate, democratic processes should be rooted in the cultural practices of the community while maintaining radical inclusion, centring on those most impacted.

The principle of FPIC means that all these elements must be in place before an area of scientific research or technology development goes ahead.

(1) These considerations should include: How did this technology come into being? Who benefits from this technology and how? What practices will this technology disrupt? How will this technology impact our social relationships? Where do the resources come from to bring this technology into being, and how will they be maintained? What are the implications of this technology over time and space, across generations and communities?
