With artificial intelligence, machines carry out specific actions, observe the outcomes, and adapt their behavior accordingly. However, this process can get out of hand if the artificial intelligence learns to work around human intervention. The solution is for artificial intelligence engineers to ensure that machines are prevented from learning how to circumvent or ignore human commands. Researchers from EPFL are studying this issue and have found a way to keep control of these machines with their human operators. The research can contribute significantly to the development of drones and autonomous vehicles by helping them operate safely.
In 2016, researchers from Google DeepMind and the Future of Humanity Institute at Oxford University introduced a learning protocol that prevents machines using artificial intelligence from learning from interruptions and becoming uncontrollable. However, with artificial intelligence now used extensively in applications such as self-driving cars on the road or drones in the air, things get much more complicated: when multiple machines are deployed for a single application, they start learning from one another. Each machine then learns not only from how it is interrupted itself, but also from how the others are interrupted.
EPFL researchers address this complexity by means of safe interruptibility. This method allows humans to interrupt the learning process of an artificial intelligence system when necessary, while ensuring that the interruptions do not change how the machines learn. In other words, forgetting mechanisms are added to the learning algorithms: they delete selected bits of the machine's memory so that the interruption leaves no trace in what the machine has learned. The researchers worked with existing algorithms and showed that safe interruptibility holds regardless of how complicated the artificial intelligence system is, how many robots are involved, and what kind of interruption occurs.
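The article does not include the EPFL team's actual algorithm, but the "forgetting mechanism" idea can be illustrated with a minimal sketch: a toy Q-learning agent that logs every value it overwrites while a human interruption is in progress, then rolls those values back when the interruption ends, so the interruption leaves no trace in the learned values. All class and method names here, and the Q-learning setting itself, are illustrative assumptions, not the researchers' code.

```python
class SafelyInterruptibleAgent:
    """Toy Q-learning agent with a 'forgetting' mechanism (illustrative
    sketch only): updates made during a human interruption are recorded
    and rolled back afterward, so the interruption does not change what
    the agent has learned."""

    def __init__(self, n_states, n_actions, alpha=0.5, gamma=0.9):
        self.q = [[0.0] * n_actions for _ in range(n_states)]
        self.alpha, self.gamma = alpha, gamma
        self._undo_log = []        # (state, action, old_value) entries
        self.interrupted = False

    def update(self, s, a, r, s_next):
        """Standard Q-learning update; logs the overwritten value if a
        human interruption is currently in progress."""
        if self.interrupted:
            self._undo_log.append((s, a, self.q[s][a]))
        best_next = max(self.q[s_next])
        self.q[s][a] += self.alpha * (r + self.gamma * best_next - self.q[s][a])

    def begin_interruption(self):
        self.interrupted = True

    def end_interruption(self):
        # "Forget": roll back, in reverse order, every update made
        # while the interruption was in progress.
        while self._undo_log:
            s, a, old = self._undo_log.pop()
            self.q[s][a] = old
        self.interrupted = False
```

After `end_interruption()`, the Q-table is identical to what it was before the interruption began, which is the essence of the property: the operator can intervene at will without biasing the learning process.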