The Financial Express

Artificial intelligence: Wolf in sheep's clothing?



Artificial intelligence (AI) reflects a mixture of algorithms and intelligence. In essence, it is intelligence both applied to mathematical equations and derived from stored formulas. Ever since John von Neumann played around with his 'parlour games' in the late 1920s and then, with Oskar Morgenstern, postulated 'game theory' to help the US air force in World War II get maximum bang from every bomb dropped at minimum cost (the well-known 'maximin' formula), much more mileage has been gotten out of numbers, at least when conveyed in 'digital' parlance, than in the thousands of years since the likes of Pythagoras were building theorems. The big difference has been the contraption utilised. Calculators accelerated problem-solving with numbers, but only over time did we learn how repeated calculations could amount to a kind of contraption 'intelligence': not just computing faster, but also remembering what it was doing was the key breakthrough from which the computer evolved. To make a long story short, the Third Industrial Revolution, which also began in the 1920s with the computer as its signature device, sowed the seeds of the Fourth Industrial Revolution (IR 4.0), based on artificial intelligence. The latter, by deduction, is neither 'industrial' (since only mathematicians were involved, then as now) nor a 'revolution' (since it has been evolving for almost a century).
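For the curious, the maximin rule invoked above has a compact textbook statement (a standard formulation offered here for illustration, not drawn from the article itself). In a finite two-player zero-sum game with payoff matrix $A$, von Neumann's 1928 theorem says the cautious values of the two sides coincide:

\[
  \max_{x \in \Delta_m} \; \min_{y \in \Delta_n} \; x^{\top} A \, y
  \;=\;
  \min_{y \in \Delta_n} \; \max_{x \in \Delta_m} \; x^{\top} A \, y ,
\]

where $\Delta_m$ and $\Delta_n$ denote the players' sets of mixed strategies; the left-hand side is the 'maximin' value referred to above.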

The point of it all is simply that artificial intelligence (alternately IR 4.0) is set to change the world, at least perceptually: we hear of automation everywhere, now even behind the automobile steering-wheel, through robots displaying many human qualities and functions, and so forth. True, some major transformations await in the wings, but these have been impacting life so incrementally since the time of von Neumann that searching for a 'revolution' is bound to be in vain. When we trace the cellular/mobile/smart-phone in our pocket today back to the room-sized ENIAC (Electronic Numerical Integrator and Computer) that von Neumann and others built in 1946, we can see the huge change. Yet, living from 1946 to the present, we have had to adjust so incrementally for so long (the mammoth ENIAC becoming the personal computer before the pocket cell/mobile/smart-phone) that our sense of surprise gets neutered.

More telling is the AI mismatch in human society: AI's postulations about society are falling apart in reality. Monideepa Tarafdar, Cynthia M. Beath, and Jeanne W. Ross noisily circulated five 'crucial capabilities' behind successful AI users ("Using AI to enhance business operations," MIT Sloan Management Review 60, no. 4, Summer 2019, 37-44). One does not have to be 'successful' to dig out the truth, but even becoming successful exposes the pitfalls blatantly: since the digital is intellectual, it starkly differs from the physical inputs behind IR 1.0 and IR 2.0; and any cross-country university appraisal confirms that students not only prefer the physical over the intellectual as far as their efforts go these days, but will also stretch their educational lifespan only as far as their physical limits allow, not necessarily their intellectual ones. For example, too small a proportion of undergraduates become graduates, even though the 'pot of gold' lies in becoming so, and even among graduates, an even smaller proportion will seek out the levels needed to crunch the numbers behind free-flowing artificial intelligence. Ironically, many more prefer the high-salary outcome of intellectual input to the low-wage counterpart of physical input.

Those five 'crucial capabilities' expose why AI's hollowness stands out more than its substance in today's society. They include: data science competency; business domain proficiency; enterprise architecture expertise; an operational IT backbone; and digital inquisitiveness. These may add up to a wolf-sized bite (in jobs, reputation, credibility), but in reality they function only as tamely as sheep (stumped by other constraints, mostly social).

Data science competency belongs to a very small fraction of university students, and is arguably diminishing even as IT (information technology) programmes and universities proliferate: too many other professions have been dragging students away. With its booming business schools, Dhaka ought to have many more potentially data-science-competent students; we do not, since we do not have a viable market, and where we do, we hire many white-collar professionals from abroad anyway (in the process swallowing our hard-earned foreign exchange unnecessarily). A similar limitation plagues business domain proficiency: we may have many business graduates, but graduates holding steady professional jobs in the same area as their training are getting harder to find. Wasted skills discourage student recruitment.

Even the subject of enterprise architecture expertise faces questions. For a start, enterprises across Bangladesh represent only a tiny number relative to our population size, and only a handful will stand the test of time, since many businesses, particularly the rules-defying RMG (ready-made garment) type, engage in practices that cannot be sustained, such as low wages, child labour, dilapidated premises, no fire-escapes, and so forth. Taxation and corporate social responsibilities are avoided far more than advocated. Even though a 'digital Bangladesh' has some frugal form of an operational IT backbone, do we really have enough to scrape the frontiers of innovation, cutting-edge practices, and so forth?

This predicament also plagues the fifth 'crucial capability', digital inquisitiveness: it is exercised by too few, and for purposes too remote, to add up to an economic driver or social catalyst.

With the 'crucial capabilities' punctured, the AI domain is increasingly littered with more 'myths' than 'realities'. Nathan Furr and Andrew Shipilov identify a good number of them ("Digital doesn't have to be disruptive," Harvard Business Review 97, no. 4, July-August 2019, 95-103), indicating the prevalent sentiment of absorbing them for analysis rather than rejecting them for what they may be worth. Some of those 'myths' raise eyebrows the moment they are aired: that digitalisation disrupts the value nexus, or eliminates physical inputs, or means buying start-ups, or implies technological growth, or even overhauls legacy systems. In reality, what is called the 'digitalisation revolution' has become Samuel Beckett's 'Godot': we keep awaiting it knocking on our door one day, or forcefully climbing in through the window, or changing lifestyles so much, so quickly, that we might just as well stay wrapped up under our blanket in the safety of our bed.

We know from all around us that these things are not happening, certainly not in full enough blossom to constitute a social 'revolution'. The digital world will need more humans, if not to crunch the numbers, then to lubricate the technologies, help new generations learn about them, and make the constant incremental advances that keep them functioning; it is too dependent on people to displace the human being, or to assert technologies upon us so overwhelmingly as to silence human hopes, skills, and capacities.

Wherever we look, we just do not see an AI army rolling down the mountains to devour the lifestyles we know and cherish. True, these will change, but it will take so much time that we will have scope to prepare for it incrementally.

Dr Imtiaz A Hussain is Professor & Head of the Department of Global Studies & Governance at Independent University, Bangladesh.

[email protected]

 
