In my last blog, I discussed the effects of WEIRD psychology on attitudes to Artificial Intelligence (AI). WEIRD societies (Western, Educated, Industrialized, Rich, Democratic) have been shown to approach trust very differently from non-WEIRD societies: they have replaced trust in friends, families, and communities with trust in the institutions that have enabled widespread trading with strangers.

Ipsos surveys and subsequent research show that WEIRD societies are more fearful of AI, and less optimistic about its prospects, than their non-WEIRD counterparts, despite having some of the most sophisticated industrialization and institutions, very highly educated populations, and substantially higher wealth. The differences did not arise from a lack of AI awareness in non-WEIRD societies, as the non-WEIRD samples came from the populations most likely to have encountered AI and to be using it.
The subsequent research confirmed that the most significant factor explaining the lower levels of trust, confidence, and understanding in WEIRD countries was the sophistication of their democratic institutions. This is consistent with the hypothesis that in WEIRD countries the reliance on institutions to substitute for interpersonal observation and trust extends beyond classical commercial instruments, such as warranties, guarantees, and corporate governance rules, to include legislative provisions and regulatory institutions. On the one hand, these institutions provide assurances that strangers must abide by a set of government-enforced rules when transacting with each other. On the other hand, this raises the question of where, in democratic environments, the motivation to create those rules comes from in the first place.
Arguably, in democratic societies, calls to legislate come from the governed when it becomes clear that commercial and interpersonal social arrangements are insufficient to provide the confidence necessary to engage and exchange. Yet if WEIRD psychology has engendered a shift in trust from social and commercial instruments to legislated ones, then in a feedback effect it would seem that, eventually, regulation will be demanded before new technologies are used at all. The lower levels of trust in AI in WEIRD countries could very well reflect the fact that, at the time the survey was taken, no regulations governing AI had been enacted in any of the developed economies. None had been enacted in the non-WEIRD countries either, but their citizens would not have been looking for such assurances before engaging with AI.
That is, are the calls for AI regulation (which are loudest in the countries where AI has been developed and most extensively used) a feature of democracy itself, rather than primarily a rational response to evidence of harm? It is telling that in 2023, the year following the exponential take-off of large language models such as ChatGPT, the number of incidents recorded on Stanford HAI's AI Incident Database increased by only a third over the preceding year, hardly compelling evidence of significant harms requiring legislated constraint. Is regulation being demanded in more democratic countries primarily to assuage citizens' fears of an unknown technology? Such calls do not appear to feature prominently in the non-WEIRD countries, where technological optimism abounds instead.
Quite coincidentally, the Ipsos data provided one question that enabled a deeper examination of trust and legislation in both WEIRD and non-WEIRD jurisdictions. Respondents were asked to express their confidence that AI firms would protect their personal data. While democracy was negatively correlated with responses to this question, the coefficient's effect in a regression equation was small, especially compared with other questions in the survey. This could be because, in Europe at least, privacy regulation (in the form of the General Data Protection Regulation (GDPR), enforced from May 25, 2018) was influencing responses. The subsequent analysis, controlling for European Union membership, found that EU membership was positively, not negatively, correlated with the privacy responses. Being an EU member increased a country's confidence that personal data would be protected by around six percent, whereas a one percent increase in a country's score on the Economist Intelligence Unit's Democracy Index decreased it by three percent. Notably, the EU members in the sample include the non-WEIRD countries Hungary, Poland, and Romania in addition to the WEIRD members.
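For readers who want to see the shape of such an analysis, the sketch below shows how a country-level regression of data-protection confidence on democracy, controlling for EU membership, might be set up in Python. It is illustrative only: the data are simulated and the variable names (eu_member, democracy_index, data_confidence) are hypothetical stand-ins, so it should not be read as a reproduction of the Ipsos analysis or its coefficients.

```python
# A minimal sketch, not the actual Ipsos analysis: the data below are synthetic,
# and the variable names are hypothetical stand-ins for the survey aggregates
# described in the text.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=42)
n_countries = 30  # assumed number of surveyed countries, for illustration only

# Synthetic country-level data: an EU-membership dummy, an EIU Democracy Index
# score (0-10 scale), and the share of respondents confident that AI firms
# will protect their personal data. The signs of the simulated effects echo
# the pattern reported in the text (positive for EU membership, negative for
# democracy), but the magnitudes are arbitrary.
eu_member = rng.integers(0, 2, n_countries)
democracy_index = rng.uniform(2.0, 9.5, n_countries)
data_confidence = (
    0.50
    + 0.06 * eu_member
    - 0.03 * (democracy_index / 10.0)
    + rng.normal(0.0, 0.02, n_countries)
)
df = pd.DataFrame(
    {
        "eu_member": eu_member,
        "democracy_index": democracy_index,
        "data_confidence": data_confidence,
    }
)

# OLS regression of confidence on democracy, controlling for EU membership,
# mirroring the structure (though not the specification) of the analysis above.
model = smf.ols("data_confidence ~ eu_member + democracy_index", data=df).fit()
print(model.params)  # estimated coefficients for the intercept and regressors
```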
While there is much debate about how effective the GDPR may or may not be, analysis of the Ipsos data appears to confirm that the presence of regulation alone alters perceptions of trust in both WEIRD and non-WEIRD jurisdictions. It may be tempting for WEIRD legislators to regulate new technologies specifically to overcome their populations' fears of the unknown, even before a technology's effects are understood. However, this would be putting the regulatory cart before the technological horse. Regulation should address real harms, not assuage fears.