Realising fully autonomous systems with AI

Posted: 13-04-2021

    The term Artificial Intelligence, or AI, has been around for over 60 years; it was first coined by John McCarthy in 1956. Back then, it described how the emerging field of computers (calculators in the original work) could be used to automate jobs that a human can do. Today, we see AI in all aspects of our daily lives, and in many cases it is totally invisible to us. Take, for example, the Lane Assist feature of most modern cars. This feature enables a car to identify when it is straying out of its lane and take corrective action to bring it back into a safe position. Cameras embedded in the front of the car perceive the white or yellow lines on the road, and using this information the car’s AI relays corrective actions to the steering control. This example exhibits two astounding advancements in the field of AI: firstly, that this type of perception can be extracted efficiently from cameras, and secondly, that complex decisions previously reserved for humans have been handed over to a machine. In this case, the car’s AI is taking control of the operation of the car.

    Figure 1: Lane Assist (VW)
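
    To make that loop concrete, here is a minimal Python sketch of the kind of correction Lane Assist performs once the camera pipeline has estimated the car’s lateral offset from the lane centre. The function name, gain and limits are illustrative assumptions, not values from any production system.

        # Sketch of a proportional lane-keeping correction (illustrative only).
        def steering_correction(lane_offset_m: float,
                                gain_deg_per_m: float = 4.0,
                                max_angle_deg: float = 5.0) -> float:
            """Map lateral offset from the lane centre (metres, positive =
            drifted right) to a steering correction in degrees (positive =
            steer left)."""
            angle = gain_deg_per_m * lane_offset_m
            # Clamp so the assist only nudges the wheel rather than yanking it.
            return max(-max_angle_deg, min(max_angle_deg, angle))

        # Perception reports the car has drifted 0.3 m to the right of centre.
        print(steering_correction(0.3))  # 1.2 degrees of corrective left steer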

    We can think of this simple example as describing one of the first instances of an autonomous system. In the future, fully autonomous cars like those being developed by Waymo, Cruise and Argo AI will take over the full operation of the car, including decisions previously reserved for humans. You will climb into your car, enter your PIN to unlock it and punch in your destination. After a few seconds, the car will begin to move autonomously and continue doing so until it reaches your destination, handling everything from stopping at traffic lights, avoiding cyclists, changing lanes and exiting roundabouts to, ultimately, parking.

    Figure 2: Self Driving Car

    However, we are not quite there just yet. Replacing an experienced human in making split-second decisions is difficult. Autonomous driving is our current-day moon landing, our space race. It is a vision which holds as much excitement for the future as it does opportunity: the autonomous driving market is expected to reach 3 trillion dollars by 2030, and for this reason billions of dollars are being invested in such projects every year. The key hindrances holding back fully autonomous driving, and other fully autonomous systems, are the many challenges still open in the field of AI, in terms of accuracy, efficiency, power consumption, accountability and trust, to name a few. As these challenges are tackled, we all benefit from the new discoveries.

    One of these recent discoveries involves leaps forward in our ability to do image classification. Image classification is a task that involves having an AI algorithm look at an image and come up with a label, or even a description, of what is in the image. Every year better and better approaches are developed and pitted against tremendously large image sets, each hoping to beat its competitors and push prediction accuracy ever closer to 100%. One such competition, the Joint COCO and LVIS Recognition Challenge, is held at the European Conference on Computer Vision. Scientists from all over the world compete to see whether their AI algorithms can accurately label images drawn from a dataset of 330,000 images. Although these competitions bring kudos and recognition to the winning research teams, a major by-product is the discovery of new and more efficient AI algorithms. These algorithms can be re-used by any reasonably keen software engineer or researcher hoping to apply new techniques to solve real-world problems.
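
    As a rough illustration of how reusable these results are, the Python sketch below loads an ImageNet-pre-trained ResNet-50 from torchvision and labels a single image. The image path is a placeholder and the choice of model is just one reasonable option, not a method from the challenge itself.

        # Label one image with an off-the-shelf, pre-trained classifier.
        import torch
        from PIL import Image
        from torchvision.models import resnet50, ResNet50_Weights

        weights = ResNet50_Weights.DEFAULT          # ImageNet-pretrained weights
        model = resnet50(weights=weights).eval()    # inference mode, no training
        preprocess = weights.transforms()           # resizing/normalisation the model expects

        image = Image.open("example.jpg")           # placeholder image path
        batch = preprocess(image).unsqueeze(0)      # add a batch dimension

        with torch.no_grad():
            probs = model(batch).softmax(dim=1)[0]

        top_prob, top_idx = probs.max(dim=0)
        label = weights.meta["categories"][int(top_idx)]
        print(f"{label}: {top_prob.item():.1%}")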

    Figure 3: Image analysis with AI

    One such example is the H2020 CYBELE project. CYBELE looks to apply new AI techniques to precision agriculture and precision livestock farming. In that project, the University of Copenhagen is using state-of-the-art AI algorithms to monitor the health and wellbeing of pigs in large pig farms. They do this with a camera pointing at a pig pen, designed to count pigs, estimate their weights and identify undesirable behaviour patterns. All of these metrics are calculated from live video and processed without human intervention, and they can be communicated to a farm manager who can then make more informed pig production decisions.

    In the example above, AI was used to help inform humans so that they can make better decisions. However, to realise fully autonomous systems, two aspects need to be handed over to the machine: as in the autonomous driving case, control of the system needs to be handed over to AI, and likewise the decision making. All systems that currently involve humans in control and decision making can make their own journey towards becoming fully autonomous, and thereby reap the rewards.
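
    Returning to the pig-monitoring example, the Python sketch below shows the general counting-from-video pattern: run an object detector on a frame and count confident detections. It is not the CYBELE pipeline, and COCO’s label set contains no “pig” class, so a real deployment would rely on a detector trained on the farm’s own footage; the frame path is a placeholder.

        # Count confident detections in one video frame with a pre-trained detector.
        import torch
        from PIL import Image
        from torchvision.transforms.functional import to_tensor
        from torchvision.models.detection import (
            fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights)

        weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
        detector = fasterrcnn_resnet50_fpn(weights=weights).eval()

        frame = to_tensor(Image.open("pen_frame.jpg"))   # placeholder video frame

        with torch.no_grad():
            detections = detector([frame])[0]            # dict of boxes, labels, scores

        # Keep only detections the model is reasonably sure about.
        confident = detections["scores"] > 0.7
        print(f"animals detected in this frame: {int(confident.sum())}")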

    At the Programmable Autonomous Systems Division (PAS) at Walton Institute, we are experts in AI, edge computing and fog data analytics. We look for opportunities to progress business processes and verticals along the fully autonomous journey, to unlock value and opportunity. We have successfully applied our expertise to Smart Grid management, Precision Agriculture and many other verticals. Our approach is to identify which aspects of control or decision support can be automated and to apply state-of-the-art technologies to automate them. Check out our current projects to learn more, and stay tuned to hear more about the future of autonomous systems. Contact Head of Division, Dr Steven Davy, for further details.


    http://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html

    https://www.researchandmarkets.com/reports/5230068/autonomous-cars-global-market-opportunities-and

    https://cocodataset.org/workshop/coco-lvis-eccv-2020.html

    Dan Børge Jensen, Mona Lillian Vestbjerg Larsen, Lene Juul Pedersen, “Predicting pen fouling in fattening pigs from pig position”, Livestock Science, Volume 231, January 2020, 103852.