‘So, how do we share our meal today?’ the lion asked. The jackal turned his face towards the donkey, who (good at school arithmetic as he was) promptly replied, ‘Divide by three, Sir,’ ensuring an equal share. For evident reasons the king of the jungle was enraged, and did not spare him. The jackal understood it was his turn, as there was no one else left other than him and the Supreme. ‘As you wish, my Lord,’ he thoughtfully replied. The lion, pleased with his reinstated supremacy, took his share (what we call the lion’s share), while the leftover was enough to satiate the jackal’s appetite and fulfill his expectation.
The jackal is portrayed as the cunning creature of the woods in most of Æsop’s fables: he understands situations better than anybody else, and offers the best (or at least a readily acceptable) solution to any problem. In the sharing dilemma, there was possibly no better alternative for the jackal to propose, as anything short of the lion’s share would have been vehemently rejected and the solver annihilated like the donkey. The jackal had evidently learnt a vital lesson from the donkey’s fate (as he usually does), and decided to put an extreme weight on the king’s share, improving massively on the simple one-third weight that elementary arithmetic suggests.

Learning from past events, the brain has developed the faculty not only to remember, but also to reason: to judge which solution will work best in a given situation, to recognize images or sounds, and to analyze them with a specific objective in mind. Science, perhaps not merely out of anthropocentrism, has found the human brain to be the most developed in this respect. Mimicking this faculty of the human brain, Neural Networks (and now their advanced descendant, Deep Learning) can find patterns in data and reason from experience how to categorize facial expressions, detect cancer or predict stock prices. Appropriate feedback on its predictions improves the level of intelligence through constant practice, much like lifelong learning.
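To make this feedback loop concrete, here is a minimal sketch (in Python, with made-up data and a hypothetical threshold rule, not any real system mentioned above) of a single artificial neuron improving through corrective feedback:

```python
import random

# A toy illustration of learning from feedback: a single artificial
# neuron nudges its weights whenever its prediction is wrong, so that
# repeated practice with corrective feedback improves its performance.

random.seed(0)
w, b = random.random(), random.random()  # weights start out arbitrary
lr = 0.1                                 # learning rate: how strongly feedback is applied

# Hypothetical training data: inputs x in [0, 1], desired output 1 if x > 0.5.
data = [(x / 10, 1.0 if x / 10 > 0.5 else 0.0) for x in range(11)]

for epoch in range(100):                 # "constant practice"
    for x, y in data:
        pred = 1.0 if w * x + b > 0 else 0.0
        error = y - pred                 # the feedback signal
        w += lr * error * x              # nudge the weights toward the right answer
        b += lr * error

print("learned weights:", w, b)          # the neuron has picked up the pattern
```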
Individual biases or personal limitations can sometimes degrade results, which is one reason Data Science relies more and more on ‘collective intelligence,’ a concept borrowed from nature. Birds fly in flocks, following a leader who shows the best way, while ants communicate among themselves by leaving a pheromone trail along the path leading to the nearest food source. Of course there are individual birds or ants who choose to explore newer avenues (rather than blindly follow the conventional path taken by the majority), and this is how the community’s search for better and newer sources of food continues.
In a colony of bees, specific scout bees are designated for this purpose. Their primary job is not collecting food, but finding new (and, if possible, better) sources. A jubilant dance performed in front of the beehive prompts other bees to follow the scout’s trail. The search thus continues unabated, raising the colony’s living standards as newer and richer food sources are found.
The emergence of ever-better solutions is perhaps most marked in the class of Evolutionary Algorithms. Following the laws of genetics, two good (parent) solutions are selected, from which a third (child) solution is born by mixing and matching features of both parents with an eclectic approach. It is not guaranteed that a better solution will evolve this way (in fact it often does not), but compared to picking a random solution blindly, this approach has a better chance of improvement.
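As a toy illustration of this mix-and-match principle (the bit-string individuals and the bit-counting fitness function below are invented purely for the example), the following sketch evolves solutions by selecting two fit parents and splicing a child from them:

```python
import random

random.seed(1)

def fitness(individual):
    # Hypothetical objective: the more 1-bits, the fitter the solution.
    return sum(individual)

# A small population of random bit-string "solutions".
population = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]

for generation in range(30):
    # Select two good parents: the fittest of a random sample.
    parents = sorted(random.sample(population, 6), key=fitness, reverse=True)[:2]
    # Crossover: the child mixes features of both parents...
    cut = random.randint(1, 9)
    child = parents[0][:cut] + parents[1][cut:]
    # ...with an occasional mutation to keep exploring new ground.
    if random.random() < 0.1:
        i = random.randrange(10)
        child[i] = 1 - child[i]
    # The child replaces the current worst solution.
    population.remove(min(population, key=fitness))
    population.append(child)

print("best found:", max(population, key=fitness))
```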
This principle of communicating and partially following a successful leader, with a hint of independence to explore new ways, has been taken up by computer (or data) scientists to devise a family of metaheuristics: Particle Swarm Optimization, Ant Colony Optimization, the Artificial Bee Colony algorithm and the Genetic Algorithm, providing general methodologies for arriving at successful solutions to many different kinds of complex real-life problems.
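A minimal Particle Swarm sketch (assuming an invented one-dimensional cost function) shows how each particle balances independence, in the form of its own momentum, against following its personal best find and the swarm’s leader:

```python
import random

random.seed(2)

def cost(x):
    # Hypothetical objective to minimise: a simple bowl with minimum at x = 3.
    return (x - 3.0) ** 2

# Each particle remembers its own best position; the swarm shares a global best.
particles = [{"x": random.uniform(-10, 10), "v": 0.0} for _ in range(15)]
for p in particles:
    p["best"] = p["x"]
global_best = min(particles, key=lambda p: cost(p["x"]))["x"]

for step in range(50):
    for p in particles:
        r1, r2 = random.random(), random.random()
        # Velocity update: keep some momentum (independence), lean toward
        # one's own past success, and lean toward the swarm's leader.
        p["v"] = (0.7 * p["v"]
                  + 1.5 * r1 * (p["best"] - p["x"])
                  + 1.5 * r2 * (global_best - p["x"]))
        p["x"] += p["v"]
        if cost(p["x"]) < cost(p["best"]):
            p["best"] = p["x"]
        if cost(p["x"]) < cost(global_best):
            global_best = p["x"]

print("swarm converged near:", round(global_best, 3))
```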
Based on a diverse range of such heuristic approaches, Artificial Intelligence (AI) and Machine Learning (ML) constitute two largely overlapping sets of tools and techniques responsible for the functioning of many gadgets and devices, and the working of many organizations and disciplines. Their applications range from electric grids to smart cameras, weather forecasting to medical diagnosis, working out business mergers to suggesting travel plans. What we once decided from human notions built on past experience is now transferred to computers or microcontrollers, not in the form of definite instructions (the essence of classical computing in the pre-AI era), but as training on how to handle unknown situations, so that the computer, now endowed with human-like intelligence, can decide for itself when to perform a certain action and when not to, what decision to take in which circumstance, and when to avoid or abstain from doing certain things.
This, it is argued, has made the computer quite independent of humans. Further, by gathering the intelligence of a diverse spectrum of natural, social and mathematical phenomena, the computer could soon emerge as a powerful entity and even overpower its human creator! There is still no unanimous answer to the controversy over whether future computers might rule the human race; even eminent scientists and long-standing friends Stephen Hawking and Roger Penrose differed in their opinions on this question.
Perhaps it only remains to be seen who wins the nail-biting match: the natural creator, or its brainchild, the ‘artificially intelligent’ computer.

Author: Partha Dey (CXsphere)
Picture Credit: Chandan Sabar
