Wednesday, January 5, 2022

Metapolitics

Does your program have too many dependencies/relations with the outside environment?
 
Recently a question appeared on a prediction forum asking whether a nationalized AGI research group would arrive at AGI before the private sector. This is interesting because most AGI research is done privately, at the fringes of the AI community. Whether, and how well, dedicated nationalized organizations would pursue AGI is in some respects a little disconcerting, as the question has a metapolitical component: it implicitly agrees that AGI is great, and differs only on in whose custody it is really great.
 
Perhaps it is too early to speculate on how computing and communication technologies have altered, or could alter, the size and quality of a state's political organization. Political organizations, unfortunately, are not software; you cannot simply download an upgrade to patch some issues and load additional features and functionality. People experience anxiety when structural changes lead to institutional changes, some of them resort to violence, and that violence then becomes a negotiation tactic for the politician. This. Happens. Every time. In that sense the people, too, are a strategic dummy.
 
Davidson & Rees-Mogg predicted that the computing and communication revolution would cause organized predatory violence to slip out from central control, and that everyone except politicians would benefit from the death of politics. But they have their biases, and I have mine. In August 1991, the hard-line coup plotters could not shut down Yeltsin's communications in Moscow because he had just acquired a cool new technology: the mobile phone. History consistently suggests that computing and communication technologies have largely benefited the politicians. Carroll Quigley, writing in Weapons Systems and Political Stability, assessed technologies by their defensive or offensive character, suggesting that offensive power favors larger, more intense political organization. Unfortunately, he died before his book could reach the computing age. Artificial intelligence, for example, could be described as a centralizing force, but a technology like that can hardly be assessed as inherently offensive or defensive. How, then, will it affect the political organization of modern nation-states if, instead of the DeepMinds and the OpenAIs of the world, a nationalized organization happens to be the first to develop AGI?
 
Most of us live in a security bubble detached from the blood and gore of war, but it does not take long for things to slip back. Bosnia and plenty of other places have proven time and again that the law is a consensus, and no consensus lasts long enough to become The Law. That is why, even if we hate them passionately, society needs politicians to negotiate that consensus on behalf of the crowd. Would an AGI make any difference to this? I can't really say. And about that AGI: its uncertainty is not the same as its low probability. The actual nature and requirements of AGI are yet to be discovered, and its behaviors are yet to be added. Perhaps yet another job post seeking software architects with 108 years of ML experience for "extensible design" is in the works.