Prof Zhaojie Ju >> School of Computing

  • An upper-limb rehabilitation exoskeleton robot based on AI and cloud computing (AiBle)

    AiBle – an upper-limb rehabilitation exoskeleton robot based on artificial intelligence and cloud computing – will use new technology to rapidly improve the rehabilitation of stroke survivors. The innovative robotic device will enable remote interventions that are more intuitive and offer better treatment outcomes. The project has a total budget of €4.8m, of which €3.3m is funded by the European Regional Development Fund via the Interreg France (Channel) England Programme. AiBle is led by the University of Portsmouth, working in collaboration with eight other partner organisations across academia and industry in France and England. Drawing on cutting-edge technology in artificial intelligence, virtual reality, cloud computing and exoskeleton control, the project will develop the first remotely controllable robotic exoskeleton to help stroke patients regain strength in their upper limbs. (More details on the news page and project website.)


  • Dexterous Humanoid Robot Motion Learning

    An ideal interface for humanoid robot control would be inexpensive, person-independent, easy to use, require no wearable equipment and, most importantly, be able to achieve the intended goal. This project aims to develop a novel goal-directed online learning method and build a human-robot interactive demonstration system that enables people to easily control and interact with humanoid robots using the most natural body gestures. The robot platforms include the Baxter and Sawyer robots.

  • Intelligent Human-Robot Interaction

    This project aims to improve human-robot interaction via body and hand gestures using state-of-the-art sensors. It will enable a robot to recognise human body gestures in real time, especially complicated hand gestures, and to identify the user's intention, which could be of great help in healthcare and service robot applications.
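    The gesture-to-intention step can be sketched as follows. This is a minimal illustration, not the project's method: the gesture names, feature values and intent labels are hypothetical, and a simple nearest-centroid rule stands in for the real gesture classifier.

```python
import math

# Hypothetical sketch: recognise a hand gesture from one frame of
# features (e.g. normalised finger-extension values from a depth
# sensor), then map the gesture to a user intention. All labels and
# numbers below are illustrative placeholders.

GESTURE_CENTROIDS = {
    "open_palm": [0.9, 0.9, 0.9, 0.9, 0.9],  # five finger-extension features
    "fist":      [0.1, 0.1, 0.1, 0.1, 0.1],
    "point":     [0.1, 0.9, 0.1, 0.1, 0.1],
}

INTENT_OF_GESTURE = {
    "open_palm": "stop",
    "fist": "grasp",
    "point": "select_target",
}

def recognise(features):
    """Return (gesture, intent) for one frame of hand features."""
    def dist(name):
        return math.dist(features, GESTURE_CENTROIDS[name])
    gesture = min(GESTURE_CENTROIDS, key=dist)
    return gesture, INTENT_OF_GESTURE[gesture]

print(recognise([0.15, 0.85, 0.05, 0.1, 0.12]))  # a pointing-like pose
```

    In a real system the centroid lookup would be replaced by a learned model running on each sensor frame, but the overall flow, features in, gesture out, intention out, is the same.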

  • DREAM Project

    DREAM is an EU-funded project that will deliver the next generation of robot-enhanced therapy (RET). It develops clinically relevant interactive capacities for social robots that can operate autonomously for limited periods under the supervision of a psychotherapist. DREAM will also provide policy guidelines to govern the ethically compliant deployment of supervised-autonomy RET. The core of the DREAM RET robot is its cognitive model, which interprets sensory data (body movement and emotion appearance cues), uses these perceptions to assess the child's behaviour by learning to map them to therapist-specific behavioural classes, and then learns to map these child behaviours to appropriate robot actions as specified by the therapists. More details are available on the official website.
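    The two-stage mapping at the core of the cognitive model can be sketched as below. This is only an illustration of the structure: the cue, behaviour and action labels are hypothetical, and simple lookup tables stand in for the two learned mappings.

```python
# Hypothetical sketch of the two-stage mapping described above:
# stage 1 maps sensory cues to a child-behaviour class (learned from
# therapist labels); stage 2 maps the behaviour class to a robot action
# (specified by the therapists). All labels are illustrative.

BEHAVIOUR_OF_CUES = {  # stage 1: stands in for a trained classifier
    ("gaze_at_robot", "smiling"): "engaged",
    ("gaze_away", "neutral"): "disengaged",
}

ACTION_FOR_BEHAVIOUR = {  # stage 2: therapist-specified policy
    "engaged": "continue_task",
    "disengaged": "prompt_attention",
}

def select_action(movement_cue, emotion_cue):
    """Pick a robot action from one movement cue and one emotion cue."""
    behaviour = BEHAVIOUR_OF_CUES.get((movement_cue, emotion_cue), "unknown")
    # Unrecognised situations fall back to the supervising therapist,
    # matching the supervised-autonomy setting described above.
    return ACTION_FOR_BEHAVIOUR.get(behaviour, "defer_to_therapist")

print(select_action("gaze_away", "neutral"))  # prompt_attention
```

    The fallback branch reflects the supervised-autonomy requirement: when the model cannot classify the situation, control passes back to the psychotherapist.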

  • Human Hand Motion Analysis With Multisensory Information

    The different properties involved in human hand motions provide rich sensory information, such as hand position, velocity, force and their changes over time, for building computational models of these motions. Integrating this multisensory information is essential for human hand motion analysis. The analysis framework consists of the following modules:

    · Motion capturing module: uses different sensors to convert the sensory information into digital signals recognisable to computers.

    · Preprocessing module: synchronises and filters the original digital data and segments them into individual tasks.

    · Knowledge base module: stores the human hand motion primitives, manipulation scenarios and correlations among the different sensory information.

    · Identification module: uses clustering and machine learning methods to train the motion models and recognise new or test sensory data.

    · Desired trajectory generation module: generates the desired trajectories based on the human analysis framework for different applications.

    · Applications: robotic hands, prosthetic hands, animation hands, human-computer interaction and so on.
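    The chain from preprocessing through identification can be sketched end to end. This is a toy illustration under stated assumptions: a moving-average filter stands in for the preprocessing module, simple thresholding for segmentation, and nearest-template matching against a two-entry "knowledge base" for the clustering/machine-learning identification; the primitive names and signal values are invented.

```python
# Hypothetical sketch of the pipeline above: preprocess -> segment ->
# identify against a knowledge base of motion primitives.

def preprocess(signal, window=3):
    """Smooth the raw sensor stream with a moving average (toy filter)."""
    half = window // 2
    return [sum(signal[max(0, i - half):i + half + 1]) /
            len(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def segment(signal, threshold=0.4):
    """Split the smoothed stream into active segments by thresholding."""
    segments, current = [], []
    for x in signal:
        if x > threshold:
            current.append(x)
        elif current:
            segments.append(current)
            current = []
    if current:
        segments.append(current)
    return segments

def identify(segment_data, primitives):
    """Match a segment to the nearest motion primitive in the knowledge
    base (stands in for the trained clustering/ML models)."""
    mean = sum(segment_data) / len(segment_data)
    return min(primitives, key=lambda name: abs(primitives[name] - mean))

primitives = {"grasp": 0.9, "release": 0.6}   # toy knowledge base
raw = [0.0, 0.1, 0.9, 1.0, 0.95, 0.1, 0.0, 0.6, 0.65, 0.0]
for seg in segment(preprocess(raw)):
    print(identify(seg, primitives))
```

    A real implementation would replace each stand-in with the corresponding module: multichannel synchronised sensor data, proper filtering and task segmentation, and trained motion models rather than a template mean.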

  • Mobile, Autonomous and Affordable System to Increase Security in Large Unpredictable Environment

    Civil installations such as power plants are often located in wide and remote areas. In the future, the number of small distributed facilities will increase as a direct result of new European environmental policies aimed at increasing societies’ resilience to climate change. However, the protection of fragmented assets will be difficult to achieve and will require portable security systems that are affordable to those in charge of their management. The BASYLIS project (supported by the European Seventh Framework Programme) aims to address these issues by developing a low-cost smart sensing platform that can automatically and effectively detect a range of security threats in complex environments. The principal obstacles to early threat detection in wide areas are of two types: functional (e.g. false-alarm rate) and ethical (e.g. privacy). Both problems are exacerbated when either the installations or the environments are dynamic, and potential solutions are unaffordable to most potential users.

    In work package 6, our objectives are to develop:

    · Multitracker: this software module will integrate all sensor alarms, including those from COTS sensors, through the COTS Integration Board.

    · Behavioural analysis: this module automates the identification and classification of suspicious behaviour of objects tracked by the multitracker.
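    The relationship between the two modules can be sketched as follows. This is a hedged toy example, not the BASYLIS design: the sensor names, object ids and the "restricted zone" rule are invented, and a hand-written rule stands in for the real behavioural-analysis classifier.

```python
from collections import defaultdict

# Hypothetical sketch of the two WP6 modules above: a multitracker that
# fuses per-sensor alarms (including COTS ones) into per-object tracks,
# and a rule-based stand-in for the behavioural-analysis classifier.

def multitrack(alarms):
    """Group (sensor, object_id, position) alarms into fused tracks."""
    tracks = defaultdict(list)
    for sensor, obj_id, position in alarms:
        tracks[obj_id].append((sensor, position))
    return dict(tracks)

def classify_behaviour(track, restricted_zone=range(90, 100)):
    """Flag an object as suspicious if any fused detection falls inside
    a restricted zone (toy rule replacing the learned classifier)."""
    return ("suspicious"
            if any(pos in restricted_zone for _, pos in track)
            else "normal")

alarms = [("camera", "obj1", 12), ("radar", "obj1", 13),
          ("camera", "obj2", 95)]
for obj_id, track in multitrack(alarms).items():
    print(obj_id, classify_behaviour(track))
```

    Fusing alarms per object before classification is what lets the behavioural analysis reason about one object seen by several sensors, which also helps with the false-alarm problem noted above.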

  • Real-Time Transfer of Human Manipulation Skills to Prosthetic Hands

    This project aims to develop a real-time platform for transferring human manipulation skills to prosthetic hands. The project is funded by the Higher Education Innovation Fund 4 (HEIF4).