Thinking Space Technology

Help for Professional Carers: Robots in Everyday Care

Professor Dr Oliver Bendel teaches and researches at the University of Applied Sciences and Arts Northwestern Switzerland. His key areas of research include information ethics and machine ethics. In the field of information ethics, he discusses, among other things, the opportunities and risks involved in using information and communication technologies, AI systems and robots. In machine ethics, he investigates approaches to endowing machines with moral behaviour and explores the field's interactions with robotics and artificial intelligence (AI).

Prof. Dr. Oliver Bendel

Professor Dr Oliver Bendel is Professor of Information Systems, Information Ethics, Machine Ethics and Robophilosophy.

Photo: Kai R. Joachim

Mr Bendel, how do you assess where we are at the moment with robots in the different areas of the care sector?

Oliver Bendel: Care robots exist primarily as prototypes. There are small batches of a few robot models. Some care robots can bring – and remove – things; others are able to inform and entertain patients. A prototype called Robear, which is not being developed further at the moment, can lift and re-bed patients, but only with the support of a carer. In general, care robots function particularly well where standardised procedures are in place and simple actions are required, for example when it comes to reminding patients of appointments or delivering medication. With regard to more complex activities, we haven't got very far – not least for technological reasons.

Obviously, robots would be a welcome addition to daily life in a professional care environment and could be used to support care staff. But how can they benefit patients?

Bendel: In many cases, care robots can improve patients’ individual autonomy. The patient is no longer exclusively dependent on people, be they carers or relatives.

Surgical robots are either controlled by doctors or act autonomously.

Photo: Fotolia / zapp2photo

How might such cooperation between human and robot function?

Bendel: Many care robots are designed as assistance systems that work in tandem with an individual or as part of a larger team. They’re not meant to replace carers, but to support them. I think that’s a good idea. Some tasks are very strenuous or even dangerous. This is where robots can help. But we shouldn’t only be considering the carers, but also need to look at the patients themselves. They should be allowed to decide if they actually want to have a robot by their side.

The use of care robots is not entirely uncontroversial, nor is it a matter of universal acceptance in the care sector. What problems or challenges do we as humans face when we interact with care robots? What controversies arise?

Bendel: Some people fear the emergence of mechanised care, the predominance of artificial intelligence and a general loss of human contact. Thirty years from now, it might well be the case that patients will only rarely see human carers. This could of course be prevented. In addition, many people perceive mobile robots as potential spies. After all, the robots have cameras and sensors that they can use – but also misuse – in a variety of ways.

What responsibility do we humans have when it comes to dealing with robots?

Bendel: First of all, the moral responsibility always lies with us humans. The problem with robots can be that it's not always possible to determine beyond doubt who is morally or legally responsible. It starts with the fact that hundreds of people may be involved in the production and operation of robots. That's why constructs such as the 'electronic person' have been devised. Legally, it is perfectly possible to attach liability to robots. But this still fails to create responsibility in the moral sense.

Are we paying too little attention to ethical issues when it comes to questions of digitalisation?

Bendel: Actually, at the moment, I’ve got more of an impression that people are overly concerned with ethical questions. We should focus more strongly on the legal perspective: in some areas of application, laws must be enforced or even created in the first place.

Do robots need rights?

Bendel: Robots have no rights. They have no capacity for feeling or suffering, no conscious mental state, no will to live. But they would need to have such qualities in order to acquire rights. They don’t have any more rights than stones. If there were ever to be a robot that could feel and suffer, it would have to be granted rights. But, for the time being, there’s no such thing in sight.

Do I as a patient have an obligation to treat my robot well?

Bendel: You can treat your robot as you choose. It feels nothing; it doesn't suffer; it doesn't care. The robot is not a moral object. Of course, it could be the case that you yourself become brutalised over time if you keep mistreating it. But that's your problem, not your robot's. You yourself are the moral object. It goes without saying that you shouldn't damage a robot that belongs to someone else, because this could cause another person to suffer – and he or she, in turn, would be a moral object. Or that other person would simply suffer a financial loss.

The therapy robot Paro is used in dementia therapy.

Photo: Tekniska museet / Peter Häll (CC-BY-SA 4.0 / https://creativecommons.org/licenses/by-sa/4.0/deed.de)

Would you personally let yourself be cared for by a robot?

Bendel: I can well imagine a robot supporting me in my old age, alongside a person I like or who does his or her job well. What I would like best of all – so long as I'm still in my right mind – would be to be able to order a robot around, telling it to do this and that. On the other hand, I wouldn't want to be fobbed off with a therapy robot when I'm no longer in my right mind.

How does Germany compare internationally when it comes to using medical robots?

Bendel: Surgical and therapy robots are widely used in Germany – just as they are in Switzerland. Care robots aren’t – neither here nor anywhere else in the world. Many places in Germany lack the necessary technological infrastructure. This is also a problem for the use of robots and artificial intelligence.

In Germany, we still have a lot of catching up to do with regard to robots in the health sector. How do you think we’ll be using care robots 20 years from now?

Bendel: In two decades, hospitals and care homes will be using both specialist and generalist robots. However, I don't think we'll be seeing a flood of them. Those who can afford it will have a care robot at home. By then, it will be able to do a lot of the things it's incapable of doing today: it'll be able to dress and undress people, and feed and wash them. And it might even have sexual assistance functions. In the field of care and assistance, too little attention is paid to the issue of sexual satisfaction. However, I'd like to see humans rather than robots solving this problem.

