
Khalifa University Center for Autonomous Robotic Systems

Research


KUCARS research is focused on complementary themes, each hosting two broad and discrete projects.

These themes and projects address, in an integrative manner, some of the frontier challenges in robotics, including robust autonomy in dynamic and unstructured environments, sensing and artificial perception, embodied artificial intelligence, sense and avoid, GPS-denied navigation, vision-based control and 3D tracking, high disturbance rejection control, smart design, and human-robot interaction.


Theme-1: Autonomous Vehicles Lab (AVLab)

AV-Lab is part of the KU Center for Autonomous Robotic Systems (KUCARS), focused on advancing autonomous and robotic technologies through four main themes and nine integrated projects over three years. The themes include Autonomous Cars, Aerial Robotics, Marine Robotics, and Industrial Robotics.

Specific projects cover situational awareness and navigation in autonomous cars, vision-based collaborative inspection using UGVs and UAVs, surveillance with aerial vehicles, critical infrastructure inspection and ocean interventions with marine robots, and robotics for autonomous greenhouse farming. This comprehensive structure aims to address diverse challenges in autonomous systems and robotics, particularly in extreme and critical environments.

Our research at the Autonomous Vehicle Lab (AV-Lab) focuses on advancing autonomous vehicle technology with an emphasis on safe operation, seamless integration into smart urban ecosystems, and aligning AI systems with passenger needs.

Our key research questions include:

  • What strategies can be adopted to guarantee safety within the vital components of the autonomous vehicle’s decision-making process?
  • How can multi-agent autonomous vehicles effectively share sensory data and decision-making information, along with inherent uncertainties, by utilizing Vehicle-to-Everything (V2X) technology?
  • How can we design decision-making strategies that offer robust safety assurances through rigorous theoretical validation?

 

 

List of Projects from AV lab:

  1. MSAP: Multi-robot Symbiotic Autonomy Platform for Next-generation Cities and Smart Communities

 

PROJECT TEAM

KU Principal Investigator


Khaled ElBassioni, Full Professor, Department of EECS

KU Co-Investigator(s)


Majid Khonji, Assistant Professor, Department of EECS

In this project, reasoning under both aleatoric and epistemic uncertainty, we seek to design a proof-of-concept Multi-robot Symbiotic Autonomy Platform (MSAP) and to synthesize provably efficient controllers and algorithms for safety-critical applications in transportation and last-mile logistics, a novel and general objective that existing abstraction methods fail to achieve. We aim to deploy the developed MSAP at Khalifa University’s (KU) SAN campus, thereby positioning KU as the first university in the region with a next-generation Intelligent Campus. Specifically, we capitalize on the infrastructure and equipment already in place at KU’s Autonomous Vehicle Lab (AV Lab), shown in Figs. 1 and 2, which include an autonomous shuttle service at the SAN campus, a contactless robotic delivery vehicle, a retrofitted autonomous Nissan Leaf car, a legged robot, a robotic arm, and four drones. On this basis, we seek to develop an MSAP operating system that supports the following use cases:

  1. Risk-aware multi-agent navigation (through collective autonomous intelligence and collective risk assessment);
  2. Autonomous (small) package delivery under epistemic and aleatoric uncertainty;
  3. User-centered passenger mobility service via preference elicitation integrated into vehicle navigation stack;
  4. Integrated multi-agent perception that captures objects beyond line-of-sight of a single agent[1];
  5. Smart monitoring and patrolling system for outdoor facilities (e.g., a parking area, outdoor storage area, or construction site);
  6. An application programming interface (API), hosted on AV Lab servers[2], for MSAP OS that enables future app development on the integrated platform.

[1] This part is an extension of the deliverables of the KKJRC-2019-Trans1 project, which involves infrastructure perception.

[2]
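As a purely illustrative sketch of the risk-aware navigation idea in use case 1, the snippet below picks the fastest route whose cumulative collision probability stays within a risk budget. The route names, travel times, and per-segment probabilities are invented for illustration; this is not the MSAP implementation.

```python
def route_risk(segment_risks):
    """Probability of at least one failure, assuming independent segments."""
    p_safe = 1.0
    for p in segment_risks:
        p_safe *= 1.0 - p
    return 1.0 - p_safe

def pick_route(routes, risk_budget):
    """Return the fastest route whose total risk stays within the budget.

    routes: dict mapping name -> (expected_time_s, [per-segment collision probs])
    """
    feasible = [(t, name) for name, (t, risks) in routes.items()
                if route_risk(risks) <= risk_budget]
    if not feasible:
        return None  # no route satisfies the chance constraint
    return min(feasible)[1]

# Illustrative candidate routes for a delivery robot on campus
routes = {
    "direct":    (120.0, [0.04, 0.05]),         # fast but crosses a busy area
    "perimeter": (180.0, [0.01, 0.01, 0.01]),   # slower, safer
}
print(pick_route(routes, risk_budget=0.05))  # perimeter
```

With a looser budget (e.g. 0.10) the same query returns the faster "direct" route, which is the essential trade-off a risk-aware planner negotiates.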

 

  2. Hierarchical Reasoning for Enhanced Integrated Autonomy: Addressing Uncertainty in Heterogeneous Robotic Systems for Logistics and Agriculture Applications – HARBOT

Khalifa University and Belgrade University Collaboration

Project Title:

Hierarchical Reasoning for Enhanced Integrated Autonomy: Addressing Uncertainty in Heterogeneous Robotic Systems for Logistics and Agriculture Applications – HARBOT

Khalifa University Faculty

PI: Dr. Majid Khonji (majid.khonji@ku.ac.ae)

Co-I: Prof. Jorge Dias (jorge.dias@ku.ac.ae)

Co-I: Prof. Lakmal Seneviratne (lakmal.seneviratne@ku.ac.ae)

University of Belgrade

Faculty

PI: Assoc. Prof. Kosta Jovanovic (kostaj@etf.rs)

Co-I: Dr. Maja Trumic (maja.trumic@etf.rs)

Co-I: Dr. Branko Lukic (branko@etf.rs)

Keywords / Key Phrases

Autonomous Legged Robots; Agriculture; Logistics; Compliant Manipulation; Uncertainty-aware Reasoning

The successful integration of autonomous systems in real-world environments, such as last-mile logistics and agriculture, requires effectively handling uncertainties and risks. This research proposal emphasizes the development of a novel technique for integrated autonomy, focusing on hierarchical reasoning, uncertainty-aware planning, and compliant manipulation and material handling in autonomous legged robots integrated with autonomous vehicle platforms. Wheeled robots, although advantageous in many applications, have inherent disadvantages when operating in challenging terrains or delicate environments: limited mobility over rough terrain and difficulty traversing uneven surfaces such as rubble and steep or loose ground. Moreover, they trample continuous strips of land, reducing the viable crop area in agricultural applications. Legged robots can overcome these limitations, making them suitable for the applications targeted in this proposal. A key aspect of the proposed approach is leveraging prediction and perception subsystems that not only provide scene understanding but also express a degree of doubt in their outputs. Unlike traditional systems that assume output accuracy, the project requires these subsystems to explicitly represent uncertainty, enabling the reasoning module to make well-informed decisions. Additionally, integrating the legged robot with an autonomous vehicle will offer benefits such as charging capabilities, acting as a depot for logistics, and providing storage for plants or crop samples in agriculture. Finally, last-mile logistics requires safe and efficient object manipulation, which should be ensured through compliance at the hardware or software (control) level.
To that end, the project will combine the expertise and cutting-edge infrastructure in autonomous mobile robots and sensing at Khalifa University (KU) with the remarkable research track record of the University of Belgrade (UB) team in collaborative robot control and applications. The main objectives of this research are:

  • Develop a hierarchical reasoning framework for integrated autonomy in autonomous legged robots and vehicle platforms, optimized for last-mile logistics and agriculture applications.
  • Investigate techniques for distilling uncertainty from sensing, prediction, and perception modalities relevant to the targeted applications.
  • Design an uncertainty-aware reasoning mechanism that respects risk thresholds and quality of service constraints, considering uncertainty information from prediction and perception subsystems.
  • Explore the benefits of integrating the legged robot with an autonomous vehicle for charging, logistics depot functionality, and agricultural storage purposes.
  • Develop a novel strategy for dynamic manipulation of objects to make compliant robots’ object handling more time- and energy-efficient.

  • Exploit the principal advantages of cutting-edge compliant robot technology, including the use of soft end-effectors for precise catching of various types of objects and the capability of storing/releasing energy in compliant joints to achieve efficient throwing strategies.


Figure 1: Proposed high-level system Architecture

Figure 2: Left: Available at AV Lab; Right: Boston Dynamics Spot robot inspecting crops.

List of Projects from VSAP lab:

  1. INTEL NEUROMORPHIC RESEARCH: Embodied Neuromorphic AI for Robotic Perception

Industrial Partner: INTEL

Project Team: PI: Jorge Dias; VSAP Lab members

The design of robots that interact autonomously with the environment and exhibit complex behaviors is an open challenge that can benefit from understanding what makes living beings fit to act in the world. Neuromorphic engineering studies neural computational principles to develop technologies that can provide a computing substrate for building compact and low-power processing systems. In this project, we aim to demonstrate why endowing robots with neuromorphic technologies – from perception to motor control – represents a promising approach for creating robots that can seamlessly integrate in society.

  2. MARVEL: Maritime Advanced Robotics, Vision, Efficiency, and Learning (DSUF Project)

Project Team:

KU Faculty Principal Investigator:

Jorge Dias, Professor of Electrical Engineering

 

KU Faculty Co-Investigator(s):

Federico Renda, Associate Professor, Mechanical & Nuclear Engineering

Sajid Javed, Assistant Professor, Computer Science

Naoufel Werghi, Professor, Computer Science

 

This DSUF proposal aims to revolutionize underwater maritime monitoring by developing the MARVEL framework: Maritime Advanced Robotics, Vision, Efficiency, and Learning. MARVEL integrates underwater multi-robot autonomous systems (an underwater swarm) with visual language models (VLMs) and multimodal AI to enable wide-area monitoring of the underwater environment. In particular, collaboration among many autonomous underwater vehicles (AUVs) enables the assessment of water quality and environmental health, producing 3D maps of the main physical and chemical environmental parameters. Advanced computer vision models allow online and offline observation of marine life, particularly coral reefs, and of man-made structures such as aquaculture farms, oil and gas facilities, and ports.

Given the presence of multiple heterogeneous autonomous vehicles equipped with onboard intelligence and advanced vision, MARVEL enhances the accuracy, efficiency, and reliability of maritime monitoring systems, mitigating issues such as low visibility and image distortion. Within the MARVEL framework, we develop a unified approach that integrates observations and measurements from many underwater vehicles into multi-modal 3D maps, combining language (text) and vision (image) modalities simultaneously, to perform advanced intelligent vision-language tasks such as Visual Question Answering (VQA), image captioning, and text-to-image search. This also allows the realization of real-time smart alarm systems triggered when any parameter is out of range or any danger is detected. MARVEL enables continuous operation, low-latency data processing, and adaptive resource allocation in power-constrained underwater environments by improving energy efficiency and distributed recharging systems. Integrating robotic swarms into MARVEL enhances monitoring capabilities, offering robustness, efficiency, advanced navigation and mapping, and improved sensing and perception. This interdisciplinary approach aims to advance the state of the art in underwater monitoring, ultimately enhancing safety and security in maritime environments.
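The out-of-range "smart alarm" behavior described above reduces, at its core, to a per-parameter range check. The sketch below illustrates that idea; the parameter names and acceptable bands are invented assumptions, not values from the MARVEL project.

```python
# Acceptable bands per environmental parameter (illustrative values only)
LIMITS = {
    "temperature_c": (18.0, 32.0),
    "ph":            (7.8, 8.4),
    "turbidity_ntu": (0.0, 25.0),
}

def check_reading(reading):
    """Return (parameter, value) pairs that fall outside their band.

    Parameters without a configured band are accepted unchecked.
    """
    alerts = []
    for key, value in reading.items():
        lo, hi = LIMITS.get(key, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alerts.append((key, value))
    return alerts

# A reading with one out-of-range parameter triggers one alert
print(check_reading({"temperature_c": 29.0, "ph": 8.9}))  # [('ph', 8.9)]
```

In a deployed system the alert list would be pushed to the base station rather than printed, and the bands would come from local water-quality standards.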

  3. KURA

Project Team: Dr. Hamad Karki, Dr. Giulia De Masi

At RoboKUp, we’re here to prove that robots and soccer aren’t just a ‘far-fetched’ idea. Envision a world where soccer and robots ‘team up’ for a game-changing experience. Our vision is to ‘kickstart’ this future, sparking a passion for soccer and tech that’s ‘off the charts.’ Together, we’re ‘building’ a new playbook, where the ‘ball’ is in the hands of both humans and robots. So, ‘gear up’ for a fantastic journey and be part of a future where robots and soccer come together for the ultimate ‘goal’!

  4. Neuromorphic Hardware

Project Team: PI: Jorge Dias (KU). Co-Is: Muhammed Shafique (NYU), Fakhreddine Zayer (KU).

Industrial Collaborator: INTEL

In our ongoing efforts to advance cutting-edge research in neuromorphic hardware, we have initiated a collaborative project between KU and NYU. This partnership aims to leverage the expertise and resources of both institutions to explore innovative solutions and applications in neuromorphic computing. Neuromorphic hardware, which mimics the neural architecture and functioning of the human brain, holds significant potential for developing efficient and powerful computing systems. This project focuses on several key areas:

Development of Neuromorphic Chips: Designing and fabricating neuromorphic chips that can emulate the synaptic and neuronal behaviors observed in biological systems. These chips are expected to be highly efficient in terms of power consumption and processing speed, making them suitable for various applications.

Algorithm Optimization: Creating and refining algorithms that are optimized for neuromorphic hardware. This involves adapting existing machine learning and artificial intelligence models to take full advantage of the unique architecture and capabilities of neuromorphic systems.

Applications in AI and Robotics: Exploring the application of neuromorphic hardware in artificial intelligence and robotics. This includes developing intelligent systems that can perform complex tasks such as pattern recognition, decision-making, and autonomous navigation with greater efficiency and adaptability.

Interdisciplinary Research: Fostering interdisciplinary research that combines insights from neuroscience, computer science, and electrical engineering. This collaborative approach aims to push the boundaries of what neuromorphic hardware can achieve by integrating knowledge from multiple domains.

  5. Erasmus+ UNICAS_Khalifa

Project Team: PI: Jorge Dias (KU). Co-Is: Fedor Kusmartsev (KU), Fakhreddine Zayer (KU).

Name of the institution (and department where relevant)

Erasmus code or city

Contact details (email, phone)

Websites

UNIVERSITÀ DEGLI STUDI DI CASSINO E DEL LAZIO MERIDIONALE

I CASSINO 01

Institutional Coordinator:

Prof. Nisticò Sergio

s.nistico@unicas.it

Departmental Coordinator

Prof. Emanuele Grossi

e.grossi@unicas.it

Contact Teacher

Prof. Antonio Maffucci

a.maffucci@unicas.it

International Officer

Dott. Tamara Patriarca

Erasmus Office

Viale dell’Università – Rettorato, Loc. Folcara, 03043 Cassino (FR) Italia

Tel: +39 0776 2994505

E-mail: t.patriarca@unicas.it

General: https://www.unicas.it

Faculty/faculties: https://www.unicas.it/didattica/corsi-di-studio.aspx

Course catalogue: https://www.unicas.it/didattica/corsi-di-studio.aspx

Khalifa University of Science and Technology

Abu Dhabi

Institutional Coordinator

Prof. Jorge Manuel Miranda Dias

jorge.dias@ku.ac.ae

Contact Teachers

Prof. Jorge Manuel Miranda Dias

jorge.dias@ku.ac.ae

Prof. Fedor Vasilievich Kusmartsev

fedor.kusmartsev@ku.ac.ae

Fakhreddine Zayer

Fakhreddine.zayer@ku.ac.ae

 

P O Box 127788, Abu Dhabi, UAE


As part of our theme activities, we created an Erasmus+ mobility program between the University of Cassino and Southern Lazio (UNICAS) in Italy and Khalifa University. This initiative fosters international collaboration and academic exchange between Europe and the Middle East.

The Erasmus+ mobility program will enable students, faculty, and staff from both institutions to engage in short-term study, teaching, and training opportunities. This program enhances the educational experience, promotes cultural understanding, and develops strong academic and professional networks between participating institutions.

Key objectives of the program include:

  • Facilitating the exchange of knowledge and best practices in various academic fields.
  • Promoting cross-cultural dialogue and mutual understanding between students and staff from different regions.
  • Providing opportunities for professional development and skill enhancement through international exposure.

  • Strengthening institutional partnerships and collaborative research efforts between UNICAS and Khalifa University.


Theme-2: Unmanned Aerial Vehicles

Aims:

The overall aim of Theme 2 is to improve drone autonomy in real-world applications, including, but not limited to, surveillance, inspection, and agriculture. The research goal is to establish a synergistic framework integrating advanced drone technologies, computer vision algorithms, and applied AI.

Projects

1- Heterogeneous UGV-UAV Vision-based Control for Collaborative Inspection and 3D Reconstruction of Critical Infrastructure

Potential Collaborator: Emirates Nuclear Energy Corporation

Project Team

PI: Yahya Zweiri. Co-Is: Naoufel Werghi, Lakmal Seneviratne, Igor Boiko, Rafic Ajaj.

This project investigates autonomous aerial drone-based inspection of critical infrastructure and indoor farms, such as concrete containment at nuclear power plants, railway tracks, oil and gas pipelines, and greenhouses. Regular inspection and early problem detection at critical facilities improve safety and efficiency.

 

2- Aerial Drones for Precipitation Enhancement Through Cloud Measurements and Seeding

Industrial Collaborator: National Center of Meteorology

Project Team

PI: Yahya Zweiri. Co-Is: Naoufel Werghi, Lakmal Seneviratne, Igor Boiko, Rafic Ajaj.

Unmanned Aerial Vehicles (UAVs) can play a crucial role in precipitation enhancement by helping to identify cloud regions suitable for seeding and by performing cloud seeding. This project aims to establish testing, validation, and enhancement frameworks for assessing the performance of UAVs for cloud seeding in the United Arab Emirates context. It will also investigate innovative vision-based paradigms for cloud seeding. This project is conducted in collaboration with the National Center of Meteorology (NCM) in the UAE.

 

3- Surveillance and Inspection in Uniform Appearance Scenes Using Aerial Vehicles

Industrial Collaborators

  • Silal
  • Abu Dhabi Police

Project Team

PI: Naoufel Werghi. Co-Is: Yahya Zweiri, Lakmal Seneviratne, Igor Boiko, Rafic Ajaj.

This project aims to provide aerial drone-based solutions for surveillance and inspection in security and agriculture applications. More specifically, research efforts will be directed towards developing solutions for two specific problems: 1) monitoring uniform-appearance crowds and scenes, and 2) regular inspection and early detection of plant disease.

Start-up Company: DroneLeaf

Website:


Theme 3 Projects

Marine Robotics for Ocean Interventions (Sponsored by KU, in collaboration with Stanford University, USA)

PI: Federico Renda

Co-I: Irfan Hussain

The oceans present critical sources of food and other resources crucial to economic development in the UAE and the world in general. Several offshore and subsea systems have been constructed to take advantage of the resources offered by the oceans. Such systems include energy harvesters, offshore oil and gas drilling platforms, communication cable networks, and fish farms. Constructing, maintaining, and repairing such offshore facilities is very challenging, requiring specialized vessels and human divers. This project will investigate the use of an underwater mobile manipulator for carrying out maintenance and repair tasks. We will focus on embracing compliance and torque control, developing novel compliant underwater arms, and multimodal underwater perception systems. Particular emphasis will be on semi-autonomous underwater rope manipulation and knotting.

 

Heterogeneous swarm of Underwater Autonomous Vehicles (Sponsored by TII, in collaboration with TII)

PI: Federico Renda

In this research project, a swarm of heterogeneous underwater robotic fish will be investigated. Starting from the biological counterpart of real fish schools, an artificial school of 30 hybrid (remotely controlled and autonomous) underwater robots will be implemented, consisting of 5 “special” fish integrated with more sensors and communication channels to the remote operator, and 25 “normal” autonomous fish with fewer sensors that are able to communicate with each other (a safety methodology to recover a robot in case of emergency is integrated). In addition, a floating beacon collects data from the fish, and a static robot deployed on the sea floor, able to resurface after operation, is designed to collect environmental data.

 

Artificial Feather Stars: A Multi-functional, Multi-agent and Hyper-redundant Approach for Soft Underwater Robotics (Sponsored by ONRG)

PI: Federico Renda

Underwater soft robotics is gaining popularity within the scientific community, thanks to its prospective capability of tackling challenges that are hardly dealt with by traditional rigid technologies, especially while interacting with an unstructured environment. Incorporating the benefits of the two approaches, in this project we propose a novel underwater soft robot inspired by feather stars, swimming members of the class Crinoidea, composed of multiple branching soft modules that surround the main body of the animal as well as of its robotic counterpart. Exploiting its elongated structure and natural compliance, each module will combine propulsion and manipulation skills in order to obtain a highly redundant system able to adapt to different tasks. The proposed design has the potential to unveil an effective solution for a broad range of underwater operations currently unsolved. It also allows safe, robust, and gentle manipulation and intervention on human-made underwater structures such as oil and gas pipelines.

 

Biomimetic-joint-thrust system for underwater propulsion (Sponsored by KU, in collaboration with The University of Edinburgh)

PI: Imran Afgan

Co-Is: Federico Renda, Vladimir Parezanovic, Sajid Javed

Jet propulsion is an energy-efficient mechanism employed in traditional aerospace and marine engines, consisting of a fast-moving jet of fluid to generate a propulsive thrust. This mechanism is also at the base of the swimming strategies of several marine species. By pointing the jet outlet in different directions and by changing the amount of water drawn, cephalopods (squid or octopi) can modify the direction and speed of their jet propulsion. Inspired by this, we present a soft jet propulsor that can control the outlet position and orientation. The design combines the actuation required for the volume squeezing at the base of the jet propulsion mechanism with a second actuator to define the orientation of the propulsor. This thruster has the potential to unveil an effective solution for a broad range of tasks currently unsolved, combining the high-speed maneuverability of traditional rigid jet propellers with the main advantages of soft robotics.
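For intuition, the thrust such a pulsed jet produces can be estimated from the standard momentum-flux relation (a textbook model, not a result from this project):

```latex
T \;=\; \dot{m}\, v_{\mathrm{jet}} \;=\; \rho\, A_{\mathrm{out}}\, v_{\mathrm{jet}}^{2},
\qquad
\dot{m} \;=\; -\rho \, \frac{dV}{dt},
```

where $\rho$ is the water density, $A_{\mathrm{out}}$ the outlet area, $v_{\mathrm{jet}}$ the jet speed, and $dV/dt$ the rate of change of the internal volume during squeezing. Steering then follows by reorienting the outlet, which is exactly the role of the second actuator in the proposed design.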

 

MBZIRC 2023: The Mohamed Bin Zayed International Robotics Challenge (Sponsored by Khalifa University/ASPIRE, in collaboration with Beijing Institute of Technology (BIT))

KU Team Lead: Irfan Hussain

The MBZIRC Maritime Grand Challenge is focused on deploying robot technology for ensuring maritime security. The challenge involves a heterogeneous fleet of autonomous aerial and surface vehicles collaborating in a relatively large GNSS-denied environment along the coast of Abu Dhabi, to detect predefined targets and to retrieve predefined objects from the targets. This is a highly complex problem and the key technical challenges include:

  • Real world robot autonomy in a large, complex environment
  • GPS-denied robot navigation
  • Robot collaboration for wide area cooperative search
  • Detection of relatively small targets in a relatively large area
  • UAV interactions with the environment – grasping and manipulations.
  • USV interactions with the environment – grasping and manipulations.

We will use a robotic system consisting of 20 UAVs and 1 USV (with a robotic arm) to solve the MBZIRC 2024 Maritime Grand Challenge tasks. Several inspection UAVs (up to 10) will search the 10 square km area in formation to detect the targets. One or more inspection UAVs will also approach the target vessels to complete the 3D modeling. Once the targets are identified, the USV will navigate to the target vessel. A transportation UAV will take off from the deck of the USV, pick up the light object(s) on the target vessel using airborne vision, and return to the USV deck to place the object(s). Two transportation UAVs will be used to move the heavy object to the edge of the target vessel. The USV robot manipulator will then pick up the heavy object and place it on the deck.
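The mission described above is essentially a fixed sequence of phases. A minimal sketch of such sequencing is below; the phase names are our own illustrative labels, not the team's actual mission software.

```python
# Illustrative mission phases for the maritime challenge workflow
PHASES = [
    "search",         # inspection UAVs sweep the area in formation
    "model_target",   # one or more UAVs approach the vessel for 3D modeling
    "usv_transit",    # USV navigates to the identified target vessel
    "light_pickup",   # transportation UAV retrieves the light object(s)
    "heavy_handoff",  # two UAVs move the heavy object to the vessel edge
    "arm_stow",       # USV manipulator places the heavy object on deck
]

def next_phase(current):
    """Advance to the next mission phase; stay at the last one when done."""
    i = PHASES.index(current)
    return PHASES[min(i + 1, len(PHASES) - 1)]

# Run the mission from the start to completion
phase = "search"
while phase != "arm_stow":
    phase = next_phase(phase)
print(phase)  # arm_stow
```

A real mission executor would gate each transition on sensor feedback (targets detected, object grasped) rather than advancing unconditionally, but the phase ordering is the backbone.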

Autonomous Underwater Robotic System for Aquaculture Applications (Sponsored by Khalifa University)

PI: Lakmal Seneviratne

Co-I: Irfan Hussain

UAE aims to meet its fish demand through expanding aquaculture infrastructure. Fish are cultured in sea-enclosures called fish cages/net pens, requiring regular monitoring for their well-being and cage maintenance. Current maintenance involves costly divers and multiple remotely operated vehicles (ROVs). Our project seeks to create an autonomous underwater robotic system for efficient aquafarm monitoring. We’ll utilize advanced computer vision with ROVs to detect fish net defects and assess fish health. This innovative approach aims to enhance the sustainability and profitability of UAE’s aquaculture industry.

Autonomous Robotic Systems for Port Inspection (Sponsored by Chinese Ministry of Science and Technology, in collaboration with Beijing Institute of Technology (BIT))

PI: Irfan Hussain

This project aims to enhance port security through an autonomous inspection and surveillance system consisting of a USV (Unmanned Surface Vehicle) and multiple UAVs (Unmanned Aerial Vehicles). The USV focuses on water surface surveillance, while the UAVs monitor the airspace around the USV. Upon detecting unusual activities, the system alerts a base station. Key hardware includes sensory-equipped USVs and UAVs with cameras, laser range finders, and grippers. Software advancements involve deep learning for activity identification, path planning, collision avoidance, and autonomous landing/grasping techniques. The design utilizes dynamic simulators such as Gazebo and the Robot Operating System (ROS), with later real-world hardware integration.

Marine Robotics for Ocean Inspection (Sponsored by Khalifa University, in collaboration with Stanford University)

PI: Irfan Hussain

Co-I: Federico Renda

Our oceans offer immense opportunities yet face sustainability challenges due to environmental degradation. KUCARS Theme 3 focuses on robotic technologies for deep-sea monitoring and interventions, addressing complexities like currents, tides, and limited visibility. Underwater challenges include lack of GPS and difficulty in mapping using SLAM-based localization. Theme 3 aims to tackle these marine robotics research issues. Key objectives include: (O1) AI-driven navigation for underwater robots, (O2) AI-based marine environment mapping, (O3) multi-sensory active perception underwater, and (O4) algorithms for underwater grasping and manipulation.

Khalifa University-KUCARS and Stanford University-SRL: Marine Robotics Research Collaboration

 

Our oceans represent both a vast wealth of opportunity and an immediate challenge in sustainability. As we seek to exploit the resources the oceans offer, we are met with environmental factors that are jeopardizing much of their ecosystem. This collaboration is grounded in the fact that human efforts to capitalize on these resources must do two things: develop means for humans to perform at depth the sorts of activities we so easily do on land (build, observe, repair), and correct and stabilize the underwater environment. Khalifa University and Stanford University propose working together on these issues to develop marine robotic technologies that enable operations at depth. Our collaboration will focus on investigating: (i) Marine Robotics for Environment Monitoring, Critical Infrastructure Inspection, and Interventions including Maintenance Operations; and (ii) Human-Robot Interaction capabilities incorporating Haptic Interfaces with Embodied and Computational Intelligence.

Autonomous Coral Reef Inspection (Sponsored by ENEC)

PI: Lakmal Seneviratne

Co-I: Irfan Hussain

Coral reefs are an important indicator of marine ecosystem health and are under unprecedented threat from human activities, including global warming, pollution, and physical damage. However, traditional coral reef monitoring relies on human observers and is technically difficult, requiring divers to operate in challenging environments. The fast-emerging field of marine robotics has the potential for high impact in monitoring reef ecology with better spatial and temporal ranges and resolutions. Current underwater robotics and sensor technology is still in its infancy and has limited ability to monitor reefs at the desired spatial and temporal ranges and resolutions. In this project we are extending marine robot capabilities to monitor reefs.


Theme 4 Projects

Agri Robotics for greenhouse farming (Funded by Khalifa University)

PI: Lakmal Seneviratne

Co-I: Irfan Hussain

This theme investigates problems related to industrial robotics, focusing on AI-driven mobile grasping and manipulation, with a particular focus on applications of robotics in agriculture. Robotics has the potential to revolutionize agriculture by performing a variety of tasks, including planting, inspection, harvesting, and weeding. Modern robots perform a wide range of manipulation tasks with high precision, speed, and efficiency. However, many open research challenges in robot manipulation remain before robots can be deployed autonomously in unstructured environments. Advances in AI are helping to address these challenges and enable the development of more intelligent robots for a wider range of applications, such as agriculture, conservation, and space exploration.

VRI: AI-driven Robotics for Greenhouse and Indoor Farming (Funded by ASPIRE)

PI: Lakmal Seneviratne

Co-I: Irfan Hussain

The agriculture sector confronts challenges from population growth, climate change, and environmental degradation, amplified by excessive pesticide use. Historically, industrial revolutions reshaped agricultural practices, with the fourth revolution (4AR) now utilizing Robotics, AI, and IoT for tasks like indoor farming. This project focuses on AI-based localization and robot-assisted harvesting, with technology addressing complexities like varying crop maturity and environmental factors. Central to this is real-time monitoring, optimizing crop growth models specific to local conditions, and advancing robotic systems for indoor farming. Our aim: a high-quality, cost-effective agricultural solution that prioritizes sustainability and leverages technology to maximize yield and efficiency.

 

Compliant Knee Exoskeleton for Rehabilitation and Assistance of post-stroke hemiplegic patients (Sponsored by ARIC/Mubadala)

PI: Irfan Hussain

Stroke survivors often struggle with mobility, especially at the knee joint; this vulnerability affects around 25% of adults, limiting function and quality of life. Traditional rehabilitation involves repetitive limb movements assisted by therapists, demanding consistency and labor-intensive effort. Robotic devices such as exoskeletons offer rehabilitation advantages but face portability and actuation challenges. We introduce a compliant knee exoskeleton whose stiffness modulation resembles that of the human knee during gait, offering effective patient assistance.

User-Defined Sixth Finger: Designing A Robotic Assistive Device to Support Stroke Patients (Sponsored by Swedish Government under STINT, in collaboration with Chalmers University of Technology)

KU PI: Irfan Hussain

After a stroke, many patients face challenges with hand and finger movements due to conditions such as hemiplegia, impeding basic tasks like holding objects. To aid these individuals, supernumerary robotic limbs have been developed to either augment the functioning limb or provide additional support for the impaired one. This research is twofold: developing the assistive technology and implementing user-centric evaluation methods that prioritize the patient’s experience. The teams at Chalmers University’s Interaction Design Unit and Khalifa University’s Mechanical Engineering Department collaboratively bring the necessary expertise.

 

Robotic Hands with Embodied Human-like Compliance and Sensing for Soft Manipulation (Sponsored by Khalifa University)

PI: Irfan Hussain

The complexity of robotic grasping merges design, actuation, sensing, and control. Our innovations have resulted in robust and intuitive robotic hands, drawing inspiration from human hand synergies. Using mathematical models and advanced materials, we have achieved significant advances, notably in grasping algorithms and neuromorphic event-based tactile sensing. Our team is also developing grasping algorithms for complex environments. Our goal is to apply this technology across sectors, especially in handling delicate and irregular items in fields such as agriculture and medicine.

Compliant Manipulation (2018-2023) – (Sponsored by Khalifa University)

PI: Irfan Hussain

Robotic technology has the potential to reduce costs and to make manufacturing systems more adaptable to real-world variability. Since humans are still key participants in manufacturing processes, co-working scenarios are common where robots assist, collaborate, and work with humans sharing the same workspace. This project aims to develop new compliant robotic manipulation systems targeting safe human-robot collaboration in complex industrial tasks. This project focuses on safe human-robot interactions based on novel mechanical system design, advanced sensing technology, intelligent control algorithms, and human intention prediction. A new robot manipulation system with switchable compliant and rigid working modes for safe human-robot co-working will be developed and benchmarked with state-of-the-art research and commercial manipulator systems, like the Baxter robot and KUKA iiwa.

Hubot: Houbara Robot for Behavioural Studies in the Field and Sampling (Sponsored by the International Fund for Houbara Conservation (IFHC))

PI: Irfan Hussain

Co-I: Lakmal Seneviratne

Studying animal behaviour requires appropriate perception systems, but these can disturb natural behaviours, especially in birds. To minimize disturbance, robots mimicking certain bird species have been introduced for behavioural studies in the wild. Equipped with perception systems, these robots closely observe birds without alarming them and record behaviours in real time. Robots can also interact with birds using movements, postures, and playback recordings, introducing new data-collection possibilities such as inducing mating behaviour. A unique robot application is collecting semen from wild males, benefiting conservation programs by capturing genetic diversity without removing individuals from the wild. In this project, we aim to develop a Houbara Robot for observation, interaction, and semen collection in the birds’ natural habitats.

Artificial Intelligence for Understanding Houbara Wildlife in UAE (Sponsored by the International Fund for Houbara Conservation (IFHC))

PI: Sajid Javed, Naoufel Werghi

Co-I: Irfan Hussain, Lakmal Seneviratne

As part of population monitoring in the wild and bird assessment in captivity, IFHC scientists use multiple sensors that produce large amounts of sensory data, such as pictures and videos. Processing and analyzing such data with human operators is extremely time-consuming and slows IFHC’s research goals. This project proposes to investigate AI-enabled computational models to analyze and understand the multitude of sensory data collected in captivity and in the natural houbara habitat, supporting interpretative tasks that include bird pose estimation, bird behavior analysis, vegetation cover assessment, and trap picture analysis.


Other Projects (Houbara, Space, MBZIRC, etc.)

MBZIRC 2023: The Mohamed Bin Zayed International Robotics Challenge (Sponsored by Khalifa University/ASPIRE, in collaboration with Beijing Institute of Technology (BIT))

KU Team Lead: Irfan Hussain

The MBZIRC Maritime Grand Challenge focuses on deploying robot technology to ensure maritime security. The challenge involves a heterogeneous fleet of autonomous aerial and surface vehicles collaborating in a relatively large GNSS-denied environment along the coast of Abu Dhabi to detect predefined targets and retrieve predefined objects from them. This is a highly complex problem, and the key technical challenges include:

  • Real-world robot autonomy in a large, complex environment
  • GPS-denied robot navigation
  • Robot collaboration for wide-area cooperative search
  • Detection of relatively small targets in a relatively large area
  • UAV interactions with the environment: grasping and manipulation
  • USV interactions with the environment: grasping and manipulation

We will use a robotic system consisting of 20 UAVs and 1 USV (with a robotic arm) to solve the MBZIRC 2024 Maritime Grand Challenge tasks. Several inspection UAVs (up to 10) will search the 10-square-km area in formation to detect the targets. One or more inspection UAVs will also approach the target vessels to complete 3D modeling. Once the targets are identified, the USV will navigate to the target vessel. A transportation UAV will take off from the deck of the USV, pick up the light object(s) on the target vessel using airborne vision, and return to the USV deck to place the object(s). Two transportation UAVs will be used to move the heavy object to the edge of the target vessel. The USV robot manipulator will then pick up the heavy object and place it on the deck.
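The multi-stage mission above can be viewed as a sequence of phases, each advancing once its goal event is observed. The following is a minimal illustrative sketch of such a mission state machine; the phase names, event strings, and transition table are our own assumptions for exposition, not the team's actual mission software.

```python
from enum import Enum, auto

class Phase(Enum):
    SEARCH = auto()       # inspection UAVs sweep the area in formation
    MODEL = auto()        # an inspection UAV builds a 3D model of the target
    NAVIGATE = auto()     # the USV navigates to the target vessel
    PICK_LIGHT = auto()   # a transportation UAV retrieves the light object(s)
    PICK_HEAVY = auto()   # two UAVs move the heavy object; the USV arm lifts it
    DONE = auto()

# Illustrative transition table: (current phase, goal event) -> next phase.
TRANSITIONS = {
    (Phase.SEARCH, "target_detected"): Phase.MODEL,
    (Phase.MODEL, "model_complete"): Phase.NAVIGATE,
    (Phase.NAVIGATE, "usv_on_station"): Phase.PICK_LIGHT,
    (Phase.PICK_LIGHT, "light_objects_stowed"): Phase.PICK_HEAVY,
    (Phase.PICK_HEAVY, "heavy_object_stowed"): Phase.DONE,
}

def step(phase: Phase, event: str) -> Phase:
    """Advance the mission phase on a matching goal event; otherwise stay put."""
    return TRANSITIONS.get((phase, event), phase)
```

A table-driven design like this keeps the mission logic auditable: unexpected events leave the phase unchanged, and every legal transition is listed in one place.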


External Collaborators
  • KAIST
  • MIT
  • King’s College London
  • Queen Mary College London
  • Coimbra University
  • SSSA Italy
  • Kingston University
  • Virginia Tech
  • UTS Sydney

External Sponsors

  • Strata
  • Mubadala
  • EMAAR
  • RTA
  • Earth
  • PAL Robotics