If, as Karl Marx argued in ‘A Contribution to the Critique of Political Economy’ (1859), the material base of society shapes its superstructure, then the rise of artificial intelligence (AI) in warfare may transform not only the technologies of conflict but also the legal and moral frameworks designed to restrain it.
The emergence of lethal autonomous weapon systems, capable of selecting and attacking targets without meaningful human control, may represent such a moment of transition.
The danger is that the technological transformation of warfare may outpace the legal and ethical frameworks meant to protect human life.
From 23 to 27 March 2024, I attended the 148th Assembly of the Inter-Parliamentary Union (IPU) in Geneva as part of the Namibian parliamentary delegation.
The assembly serves as the IPU’s principal political platform where national parliaments deliberate on emerging global threats to peace, democracy, security and human dignity.
Among the most important debates was the rapid development of autonomous weapons and the growing integration of AI into military operations.
There is currently no universally agreed definition of lethal autonomous weapon systems.
However, at the 148th IPU assembly, delegates adopted a formulation proposed by the International Committee of the Red Cross.
Under this formulation, autonomous weapon systems are weapons with autonomy in their critical functions, meaning systems capable of identifying, tracking and attacking targets without human intervention.
As armed conflicts continue to proliferate across the world, drawing in major powers and increasingly integrating AI-driven technologies, the debate over autonomous weapons has shifted from theoretical concern to urgent political reality.
The key question is no longer which side prevails in a conflict but whether humanity is prepared to surrender its moral conscience and judgement to machines, allowing algorithms, rather than human beings, to determine who lives and who dies.
Unlike traditional weapons, autonomous systems rely on AI to analyse data, identify targets and execute attacks with little or no human intervention.

The 148th IPU assembly acknowledged that systems with varying degrees of autonomy have already been deployed in active conflicts.
According to the Australian Human Rights Commission’s submission to the Human Rights Council Advisory Committee dated 30 November 2023, autonomous drones have been used in a “fire, forget and find” method, whereby a weapon, once launched, locates and strikes its target entirely on its own.
Such developments mark a profound shift in the relationship between power, violence and human responsibility.
International humanitarian law must therefore be fully applied to limit the brutality of war and preserve a measure of humanity in conflict zones.
Central to this framework are the principles of distinction, proportionality and accountability.
The principle of distinction requires parties to a conflict to differentiate between combatants and civilians, directing attacks only against military objectives. The principle of proportionality prohibits attacks expected to cause civilian harm excessive in relation to the concrete military advantage anticipated.
Such judgements require human reasoning.
Commanders must assess context, anticipate consequences and reassess decisions as battlefield conditions evolve. If machines are allowed to make life-and-death decisions without meaningful human control, determining responsibility when civilians are harmed becomes deeply problematic.
Human rights law reinforces these concerns.
The Universal Declaration of Human Rights, adopted on 10 December 1948, established the universality of human rights as a cornerstone of international law.
Building on this foundation, the international community adopted the International Covenant on Civil and Political Rights (ICCPR) and the International Covenant on Economic, Social and Cultural Rights in 1966, forming together the International Bill of Human Rights.
Article 6 of the ICCPR affirms the right to life as a supreme right from which no derogation is permitted.
Allowing autonomous systems to determine who lives and who dies is therefore fundamentally at odds with the foundations of international human rights law.
BIAS AND DISCRIMINATION
AI also raises serious concerns about bias and discrimination.
AI systems rely on data and predictive models that may reproduce the biases embedded in the societies that design them.
In warfare, such systems risk disproportionately affecting vulnerable groups, including persons with disabilities.
Individuals using assistive devices such as wheelchairs, walking sticks or hearing aids may move in ways automated systems misinterpret as suspicious.
Deaf persons may not hear warnings and blind persons may not see signals intended to alert civilians of danger. A machine trained on imperfect data may therefore fail to distinguish vulnerability from threat.
These concerns reflect a broader unease about new and emerging technologies in the military domain.
While such technologies promise strategic efficiency, they raise profound questions about control, accountability and human dignity.
Recognising this, the 148th IPU assembly called on parliaments worldwide to regulate the development and use of autonomous weapons and ensure that meaningful human control remains central to all decisions involving the use of force.
Namibia, as a committed member of the international community and a signatory to the core instruments of human rights law, cannot stand apart from this debate.
The National Assembly must ensure that technological innovation does not outpace democratic oversight or erode the legal norms designed to protect human life.
In line with the obligations recognised at the IPU assembly, Namibian parliamentarians should initiate formal debate on autonomous weapon systems and engage the United Nations on the development of a legally binding international instrument regulating lethal autonomous weapon systems.
The 2026 deadline set by the United Nations Secretary-General is not a distant horizon. It is the present moment, and Namibia must not be found silent.
As Antonio Gramsci reminds us, the struggle for human dignity is ultimately a struggle to reclaim moral and political control over the systems that govern our lives.
If humanity is to retain authority over the technologies it creates, that struggle must begin in parliaments such as ours.
– Henny Seibeb is a former member of the National Assembly and represented the Namibian parliamentary delegation at the 148th Assembly of the Inter-Parliamentary Union in Geneva in March 2024.