Unitsky String Technologies
Site news
10 September 2019
SkyWay intelligent control system

An interesting subject led us to one of the offices of the SkyWay design organization for a conversation with Yuri Sorokin, PhD in Engineering Science, a specialist in artificial intelligence and automated control systems. We came to Yuri to talk about the safety of SkyWay transport complexes. Along the way, we learned how the artificial intelligence used in SkyWay differs from that of driverless cars, how unibuses communicate with each other, and which film is worth watching if you want to learn about the development of artificial intelligence.

About the need for intelligent systems

Artificial intelligence (AI) is over-idealized, largely because of marketing. Its development is still at an early stage. When people first set out to teach machines to analyze information and make decisions, they began by coding fixed algorithms into them. The result depended on the developer's ability to foresee everything, which is obviously impossible. Hence the need to teach the computer to think, to reason, and to adapt. Professionals all over the world are working on this now, but a computer still cannot do without a human.

How is artificial intelligence like a child?

Somewhere in an ideal universe, children can learn from their mistakes: they explore the world on their own, and the cost of those mistakes is negligibly small. But that is an ideal universe. In the real world, you would not want your son to stick his fingers into a socket to discover that electricity hurts. The same goes for artificial intelligence. Suppose there is a small car and a labyrinth, and the car is set loose to find the exit. It drives into a dead end, remembers that there is no exit there, and turns back. The cost of such an error is small, and so is the risk. But now suppose an obstacle appears in front of a car while it is moving.
For example, a truck is standing in the roadway, and there is not enough braking distance left. A car tuned for strict compliance with traffic rules then faces a choice: cross the solid line into a free lane of oncoming traffic, or brake and allow a collision, with possible consequences for the passengers' health. A self-learning car may choose not to violate the rules and then, after evaluating the consequences, conclude that the decision was wrong. But you would hardly want to become a laboratory guinea pig and train a car at the expense of your own health. Self-learning AI is therefore acceptable only where the consequences of a wrong decision are small. Accordingly, the human role in building AI systems is undeniably large: it is people who must teach the system first, and the quality of that training depends directly on the skill of the developers, the programmers, and everyone else involved. The machine must remain under control at all times.

How do we teach SkyWay transport?

Take, for example, the facial recognition system installed in the SkyWay transport complex. Before it could learn to detect deteriorating health, acts of vandalism or violence, or unwanted passengers, it first had to learn to recognize faces at all. It was shown a huge number of images of people and situations, including illegal ones, and identified patterns in them, so that it could eventually identify people for access control to the transport complex's services and detect situations such as forgotten belongings or fights. Training of the string transport goes on constantly: information for analysis is recorded during every run, whether it is the AI or the suspension system that is being tested, so the dataset is continuously updated. Manufacturers of driverless cars likewise collect training data from their vehicles as they cruise the streets.
In their case, however, this is necessary to teach the car to react to every danger it meets on the way and to protect it from outside threats. SkyWay's AI has more humane goals: by raising the track to the second level, we have eliminated external threats, so the AI learns to monitor safety inside the vehicle and to calculate the fastest and most convenient route assignments. When unibuses go into mass production and appear on city streets, the AI will keep learning, but this will never threaten human safety.

Comparing string transport and driverless cars

Both driverless cars and SkyWay transport systems use several types of sensors, including optical sensors (cameras) and radars. What are their strengths and weaknesses? Optical sensors with well-trained AI recognize objects easily, although they are still far from human level: a camera sees nothing more than a set of pixels, even if it can pick a specific object from a given list out of a crowd faster than a person can. But a camera measures distance and speed poorly, and it is very sensitive to external conditions such as light, rain and fog. There are ways to cope with these, but they significantly increase the cost of the vision system. The second sensor widely used for object detection, both in string transport and in most driverless cars, is radar. It is considered all-weather and measures the distance to an object and its motion parameters excellently. But because it captures few classifying features, it can hardly identify the objects it detects and perceives everything as an obstacle. During tests at the test site we see reflections from everything: grass, supports, infrastructure. If the vehicle treated them all as traffic obstacles, it would never move. To improve the information received from cameras and radars, we have created software that combines the data from the optical and radar sensors.
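The combination described here can be illustrated with a toy sketch (not SkyWay's actual software): the radar supplies range and radial speed, the camera supplies a class label, and detections are paired by bearing. All the numbers, field names and the 5-degree matching gap are invented for illustration.

```python
def fuse(camera_dets, radar_dets, max_bearing_gap=5.0):
    """Pair each radar return (bearing deg, range m, radial speed m/s)
    with the nearest camera detection (bearing deg, class label).
    Radar supplies distance and speed; the camera supplies identity."""
    fused = []
    for r_bearing, rng, speed in radar_dets:
        label, best = "unknown", max_bearing_gap
        for c_bearing, cls in camera_dets:
            gap = abs(c_bearing - r_bearing)
            if gap < best:
                label, best = cls, gap
        fused.append({"class": label, "range_m": rng, "speed_mps": speed})
    return fused

# Invented detections: a pedestrian off to the left, a vehicle ahead,
# and one radar return (a reflection from grass, say) no camera confirms.
cams = [(-10.0, "pedestrian"), (2.0, "vehicle")]
radar = [(1.5, 40.0, -3.0), (30.0, 120.0, 0.0)]
print(fuse(cams, radar))
```

The unmatched radar return stays "unknown", which is exactly the reflection-from-everything problem the text describes: fusion lets the system keep radar's range and speed while discarding or down-weighting returns the camera cannot classify.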
You could see it in action during the demonstration of Anatoly Yunitskiy's "doggie". We have taught the AI to supplement one kind of information with the other, so that it makes the best decisions from the combined data and ensures the safe movement of vehicles.

The third type of sensor, used by some driverless cars but not by SkyWay Technologies Co., is the lidar. A thin laser beam scans the area and produces an excellent optical picture while also measuring motion parameters well. Everything would seem fine but for a few "buts" that keep us from using lidars on string transport. First, the high cost. Second, a lidar is a mechanical device, and mechanical devices can break. Third, it is an optical device: in rain or fog, when the laser beam is scattered, it sees a wall. In our opinion, lidar is better suited to California. Since we are targeting a diverse market and testing transport in Belarus, where rain and fog are routine, lidars are impractical for us.

Almost SkyNet

As a rule, each driverless vehicle is treated as an independent unit: self-sufficient, with fixed parameters and preloaded information such as a map of the area. The movement of such units is chaotic, which in turn leads to low throughput. In SkyWay, each element of the infrastructure is also endowed with "intelligence and quick wits", and on top of that there is a central intelligent control system (CICS). It controls the entire transport complex according to the tasks set: it monitors the speed and safety of movement, combines traffic flows, and responds to the wishes of each individual client to ensure maximum comfort. CICS calculates the most convenient and optimal route assignments, continuously updating them as new data arrives. Another difference between SkyWay's intelligent systems and driverless cars is the ability to respond quickly to emergencies.
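One form such a quick response can take, recomputing route assignments around a failed section of track, can be sketched with a standard shortest-path search. This is a generic illustration, not CICS code; the track layout and weights are hypothetical.

```python
import heapq

def shortest_path(graph, start, goal, blocked=frozenset()):
    """Dijkstra over a dict-of-dicts track graph, skipping nodes in
    `blocked` (e.g. a section occupied by a failed vehicle)."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path)), d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph[node].items():
            if nxt in blocked:
                continue
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    return None, float("inf")

# Hypothetical track layout: A-B-D is the short route; if section B
# fails, every affected pod is instantly reassigned via C.
track = {"A": {"B": 1.0, "C": 2.0}, "B": {"D": 1.0},
         "C": {"D": 2.0}, "D": {}}
print(shortest_path(track, "A", "D"))                 # normal route via B
print(shortest_path(track, "A", "D", blocked={"B"}))  # rerouted via C
```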
The failure of any vehicle triggers an instant recalculation of the route assignments for all the other pods linked to that route, along with the measures needed to resolve the situation: evacuation, onward transportation, and so on. In short, what distinguishes our system from competitors' is global traffic-flow control that does not compromise the individuality of each separate vehicle.

What do vehicles say to each other?

Communication between vehicles is not limited to the centralized channel, since the failure of one of its elements could otherwise lead to a collapse. Each vehicle therefore communicates with all elements of the transport infrastructure. Suppose the impossible: CICS stops functioning. Each pod still knows its task, communicates with the pods nearby, and relays information about its surroundings, so all vehicles can continue along their routes and respond to changes regardless of whether CICS is operating.

Intelligent control system in action

The development of the vehicle-evacuation technology is one example of vehicles executing a task from CICS while also communicating directly during docking. The "doggie" is another concrete demonstration of our systems' capabilities: the movement of vehicles in a virtual coupling. From the safety standpoint, the distance between vehicles should depend on the braking distance needed for the second vehicle to stop when the first one stops, and ideally it should be as large as possible. On the other hand, increasing passenger traffic requires reducing that distance. How can a balance be found? We are not the pioneers of virtual-coupling research; competitors have worked on it too. But we have applied and modernized the concept. The first vehicle is the leader and performs its route task.
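The spacing trade-off behind a virtual coupling, a gap no smaller than the follower's stopping distance yet as small as throughput demands, can be illustrated with a conservative headway formula. The parameters (reaction time, deceleration, margin) are illustrative, not SkyWay's actual control law.

```python
def min_headway(speed_mps, reaction_s=0.2, decel_mps2=3.0, margin_m=5.0):
    """Conservative minimum following distance: distance covered during
    the control loop's reaction time, plus the full braking distance
    v^2 / (2a), plus a fixed safety margin. This assumes the worst case
    of the leader stopping instantly; a data link that relays the
    leader's brake command lets followers safely close the gap further,
    which is the point of the virtual coupling."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2) + margin_m

for kmh in (40, 100, 150):
    v = kmh / 3.6  # km/h to m/s
    print(f"{kmh} km/h -> minimum gap {min_headway(v):.1f} m")
```

Because the braking term grows with the square of speed, the required gap grows much faster than the speed itself, which is why tight high-speed platoons need coordinated braking rather than independent reactions.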
The rest monitor and adjust their movement parameters: the distance, interval and time to the module in front. Why do we need a virtual coupling? Suppose the task is to carry goods and passengers on the Minsk-Moscow route as quickly as possible. To solve it, we need to maintain a minimum interval between modules, and that can only be achieved through a virtual coupling. The task CICS sets is to form a virtual coupling based on the existing traffic schedules and the individual needs of passengers.

Evgeny Rodchenkov, Head of the Intelligent Systems Administration of SkyWay Technologies Co., comes into the office. Interested in the topic, he adds to Yuri Adamovich's account: "By the way, in commercial terms a virtual coupling reduces energy consumption and increases passenger traffic, and correspondingly profit. Driving in a stream reduces the load, as it does for trucks in a convoy, cutting energy consumption and raising passenger throughput."

Yuri Adamovich continues with the technical side of the issue: "By calling the unibus a 'doggie', Mr. Yunitskiy demonstrated not only the adaptive following of a leader needed for the virtual coupling, but also the execution of commands quite unusual for the system: escape and pursuit. You see, when Mr. Yunitskiy said that the problem our company has solved is a very serious one, he was not boasting or exaggerating at all. We really are in the lead on this issue."

What does the unibus see?

Returning to machine vision, let me remind you that at one point in the demonstration Anatoly Yunitskiy said, "The system sees a lot of unnecessary objects and interference." That is because, unlike smooth asphalt, we worked on an imperfect surface. The fact that our system has learned to pick out the necessary objects and process the necessary information despite these surface distortions speaks of tremendous progress.
If other people had been in the unibus's path during the tests, it would have handled them on the principle of the greatest potential threat, calculating where their routes intersect from the motion parameters of both the vehicle and the potentially dangerous objects. After all, the nearest person may be moving away while someone standing in the distance may be approaching.

Intelligent systems against vandalism

Our machine vision system already recognizes a weapon inside or outside the vehicle and sends a "potentially dangerous situation detected" signal to the dispatcher. It is the dispatcher who must take further action. The human still plays the most important role here, because the cost of failure would be very high if, say, a person in the cabin turned out to be holding a real gun rather than a toy one and we let the artificial intelligence choose what to do. Recognizing acts of vandalism and potentially dangerous situations such as aggressive behavior is not easy, however: the AI will have to learn to distinguish a friendly slap on the shoulder from a blow, and a greeting gesture from a threat. We are using huge datasets so that vehicles will eventually be able to do this.

Evgeny Rodchenkov adds: "There is a film in which it is not a person who teaches a robot, but the other way around. It is called 'I Am Mother'. Its point is how a decision is chosen, why that one, and who is really smarter: a man or a robot."
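The greatest-potential-threat principle mentioned above, predicting where routes intersect from the motion parameters of the vehicle and of the people near it, can be sketched as a closest-point-of-approach calculation. This is a textbook formulation, not SkyWay's implementation; all positions and velocities below are invented.

```python
def time_to_closest_approach(rel_pos, rel_vel):
    """Time at which two constant-velocity movers are closest:
    t* = -(p . v) / (v . v), clamped to t >= 0 since only the
    future matters."""
    vv = rel_vel[0] ** 2 + rel_vel[1] ** 2
    if vv == 0.0:
        return 0.0
    t = -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / vv
    return max(t, 0.0)

def miss_distance(rel_pos, rel_vel):
    """Predicted minimum separation (metres) along straight-line paths."""
    t = time_to_closest_approach(rel_pos, rel_vel)
    dx = rel_pos[0] + rel_vel[0] * t
    dy = rel_pos[1] + rel_vel[1] * t
    return (dx ** 2 + dy ** 2) ** 0.5

# Two hypothetical pedestrians, given as (position, velocity) relative
# to the vehicle, in metres and metres per second:
near_but_leaving = ((5.0, 2.0), (1.0, 1.0))     # close, walking away
far_but_crossing = ((40.0, -3.0), (-8.0, 0.6))  # distant, converging
threats = sorted([near_but_leaving, far_but_crossing],
                 key=lambda pv: miss_distance(*pv))
print(threats[0])  # the distant but converging pedestrian ranks first
```

This captures the article's point exactly: the nearest object is not necessarily the most dangerous one, since ranking by predicted miss distance puts the distant pedestrian on a converging course above the nearby one who is walking away.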
© 1977–2019 Anatoly Yunitskiy. All Rights Reserved.