Rushan Ziatdinov

In March 2023, I published an article in The Korea Times titled “ChatGPT Utilization and the Future of Academia.” In that article, I predicted that we would see several new AI-based tools and technologies in the future. Despite criticism that AI tools cannot produce truly novel concepts and sometimes even hallucinate, I remain optimistic about the future of applied AI. That said, one of my concerns is that AI and new technologies should augment our activities as intelligent assistive technologies rather than take over human jobs.
It's a stark fact that millions of people around the world are visually impaired and live in a world without light. But now there are free apps, such as Seeing AI, developed by Microsoft, that allow users to take a photo with their smartphone and receive a description of the scene, converted from image to text. What's more, smart glasses developed at the California Institute of Technology in Pasadena can translate images into speech that can be intuitively understood without any training.
As these technologies evolve, they could be integrated into small cameras worn on the body, 360-degree cameras, or even cameras built into shoes. Such devices could help the blind and visually impaired fully perceive their surroundings through hearing rather than sight, allowing them to recognize objects, colors, and even human emotions.
AI-based accessibility and assistive technologies can be used for real-time captioning for the deaf and hard of hearing (providing live captions for videos and during real-world interactions such as lectures, meetings, and public announcements) and visual description for the blind (converting images into descriptive text to help visually impaired users understand the content of an image or video).
Similar tools could potentially be used to train blind animals, although the exact number of such animals is unknown. Animals are known to communicate through a variety of signs, including sounds and movements. In the future, AI could help decipher animal language and understand the meaning of animal communication. Various AI tools can be utilized to facilitate communication and training with animals living in a world of darkness.
According to the BBC, it's estimated that up to 7,000 different languages are spoken around the world. What about dogs? Do they speak the same language, or do different breeds have their own? Can a South Korean Jindo understand a North Korean Pungsan dog? Good question. Despite the vast differences between human language and animal communication, it's possible that scientists could use AI to answer questions about animal communication and understanding.
Intelligent image- or video-to-text conversion technology could potentially be integrated into a car's black box cameras and more advanced lidar (light detection and ranging) sensors. Such developments could benefit drivers by providing additional assistance in avoiding various hazards, such as pedestrians crossing the road at undesignated crosswalks or animals crossing the road at night. In some cases, drivers could also send relevant information to navigation services such as Naver or Kakao Maps.
Ultra-intelligent cameras can integrate various AI techniques related to text, voice, gesture, image and video. Could they help life scientists conduct natural experiments and generate automated reports? When an experiment is too long or too dangerous for humans, such cameras could be a useful tool for wildlife monitoring. This could include, for example, studying populations of wild boars in forests, tigers and lions in the savannah, or discovering marine creatures living in the deep seas that are not yet known to science.
Animal behavior may be affected by climate change, and understanding how animals communicate with the help of AI can help scientists understand how these changes affect animal communication, migration patterns, and overall ecosystems, allowing them to develop better strategies for conservation and climate adaptation.
Speaking of wild animals, I sometimes take an early-morning walk in a small city park with trees, during which I hear many beautiful bird calls. I am grateful to the birds for these beautiful melodies. It is possible that the birds are communicating with me or with each other, but I am not sure. Korean cities do not seem to make much provision for feeding wild birds, so I do not know where they find their food.
In the future, if AI can help us understand bird calls, what insights will we gain from birds and what requests will they make of us? Will “wild” birds complain about wild humans trying to drive them out of their habitat?
Rushan Ziatdinov (www.ziatdinov-lab.com) is a professor in the Department of Industrial Engineering at Keimyung University in Daegu. He can be contacted at ziatdinov.rushan@gmail.com.
