Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to a determination", depending on the context.
For example, a sweeping robot with a camera differs from robots relying on other environment-sensing sensors (such as laser radar, infrared, or ultrasonic sensors): the camera is equivalent to the eye of the robot. Because the visible range of the camera is limited (its size is determined by the camera's angle of view and visible distance), a blind area generally exists, and an object in the blind area is invisible to the sweeping robot at the current moment. As shown in fig. 1A, the shaded area is the visible range of the camera, and thus the visible range of the robot; the other areas are blind areas, whose angle and depth are determined by the camera's angle of view and its installation position on the robot. As can be seen from fig. 1A, the visible range of the camera has a main direction, and the farther from the main direction, the smaller the robot's visible range; while the robot travels in a straight line, this main direction coincides with the robot's traveling direction.
To avoid the low traveling efficiency caused by two consecutive pause-and-pivot turns in a single U-turn, one approach, shown in fig. 1B, is the following: when the sweeping robot travels to a turning point, it pauses and pivots 90 degrees in place, then travels along an arc-shaped track to the next adjacent straight route, completing one U-turn. However, in this arc-track turning mode the robot still needs to pause and pivot in place, so its motion is jerky and its operating efficiency is low. Moreover, the direction of the whole turning track always deviates from the main direction and falls into the robot's blind area, so the probability that the sweeping robot collides with an object in the blind area is high.
When the sweeping robot performs cleaning, the bow-shaped (boustrophedon) sweeping it adopts means that the robot travels along a straight route to a turning point, pivots 90 degrees in place, travels for a certain width, then pivots 90 degrees again so that its current traveling direction is opposite to that of the original straight route, and then continues to the next turning point.
To solve the above problems, the present application improves the robot U-turn control method so that, during a U-turn, the robot does not need to pause and pivot in place.
Fig. 2A is a flowchart of an embodiment of a robot U-turn control method according to an exemplary embodiment of the present application. The present embodiment takes how to plan a turning route before the robot travels as an example. As shown in fig. 2A, the robot U-turn control method includes the following steps:
Step 201: when the robot detects an obstacle on the first straight route, determine the distance S1 between the current position and the obstacle.
In the present application, if there is no obstacle ahead, the robot usually travels along a straight route until an obstacle is detected on that route, at which point a turning route needs to be planned so that the robot can move from the current straight route to the next one.
Based on this, the robot may construct an environment map using data collected by its sensors. The environment map covers both the area the robot has traveled and the area it has not, and marks the robot's current position, the first straight route the robot is on, and any obstacle detected from the obstacle-sensor data. If the robot detects an obstacle on the first straight route, the robot cannot continue along that route indefinitely, so the distance S1 between the current position and the obstacle must be determined in order to plan the turning route.
The obstacle can be any object that hinders the robot from advancing, such as a wall, a table, or a person. The sensors arranged on the robot can include an odometry sensor, a code-wheel (encoder) sensor, an obstacle sensor, an image sensor, and the like. The obstacle sensor may be a laser sensor, a radar sensor, or the like.
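As an illustrative sketch of how S1 might be obtained from such a map, the following Python snippet walks an occupancy grid along the robot's straight route until it hits an occupied cell. The grid format, function name, and cell size are assumptions for illustration only; the application does not prescribe a particular map representation.

```python
def distance_to_obstacle(grid, robot_rc, heading_rc, cell_size=0.05):
    """Walk cell by cell along the straight route from the robot's
    current grid cell; return the distance S1 (metres) to the first
    occupied cell, or None if the route is clear to the map edge.

    grid       -- 2D list, 1 = obstacle, 0 = free (hypothetical format)
    robot_rc   -- (row, col) of the robot
    heading_rc -- unit step per cell, e.g. (-1, 0) for "up" the map
    """
    r, c = robot_rc
    dr, dc = heading_rc
    steps = 0
    while 0 <= r < len(grid) and 0 <= c < len(grid[0]):
        if grid[r][c] == 1:
            return steps * cell_size   # first occupied cell found: S1
        r, c = r + dr, c + dc
        steps += 1
    return None                        # no obstacle on this route
```

With a 5 cm cell size, an obstacle four cells ahead yields S1 = 0.2 m.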
Step 202: determine a turning route for turning from the first straight route to the second straight route according to the distance S1 and the route spacing S2 between the first straight route and the second straight route.
The route spacing S2 between the first straight route and the second straight route may be preset; that is, the distance between every two adjacent straight routes is S2.
In the turning route shown in fig. 1B, when the robot travels to the turning point along a straight route, it must stop on the spot and can continue along the arc-shaped track only after pivoting 90 degrees. The reason is that the derivatives at the turning point on the straight route and on the arc-shaped track differ, so the robot's motion-control module cannot transition smoothly onto the arc while moving; the robot must first be stopped and pivoted 90 degrees before transitioning to the arc-shaped track.
For this reason, the turning route determined by the present method consists of an arc connecting a first turning point on the first straight route and a second turning point on the second straight route, where the derivative at the first turning point is the same on the first straight route and on the turning route, and the derivative at the second turning point is the same on the second straight route and on the turning route. The robot can therefore transition smoothly from the first straight route onto the turning route, and then from the turning route onto the second straight route.
In this embodiment, when the robot detects an obstacle on the first straight route, it determines the turning route from the first straight route to the second straight route according to the distance S1 to the obstacle and the route spacing S2 between the two straight routes. Since the derivative at the first turning point is the same on the first straight route and on the turning route, and the derivative at the second turning point is the same on the second straight route and on the turning route, the robot can, while driving, transition smoothly from the first straight route onto the turning route and then onto the second straight route without pausing for pivot steering; its motion is smooth and its operating efficiency is high.
In an embodiment, the turning route may comprise a first arc-shaped trajectory between the first turning point and a first intermediate point, a first straight-line trajectory between the first intermediate point and a second intermediate point, and a second arc-shaped trajectory between the second intermediate point and the second turning point.
The derivative at the first turning point is the same on the first straight route and on the first arc-shaped trajectory; the derivative at the first intermediate point is the same on the first arc-shaped trajectory and on the first straight-line trajectory; the derivative at the second intermediate point is the same on the first straight-line trajectory and on the second arc-shaped trajectory; and the derivative at the second turning point is the same on the second straight route and on the second arc-shaped trajectory. Since each of these points has the same derivative on both trajectories it belongs to, the robot transitions smoothly onto the next trajectory whenever it reaches one of them during the U-turn; it does not get stuck, runs smoothly, and operates efficiently.
In the present application, the distance between the first intermediate point and the second intermediate point is a preset distance S3, the center point between them lies on the center line between the first straight route and the second straight route, and the first straight-line trajectory is perpendicular to both straight routes. Based on this, the positions of the first and second intermediate points can be determined from the distance S1, the route spacing S2, and the preset distance S3.
The length of the first straight-line trajectory between the first intermediate point and the second intermediate point is S3, with value range: 0 ≤ S3 ≤ S2.
In addition, the longitudinal distance between the first turning point and the first intermediate point, and that between the second turning point and the second intermediate point, are both a preset distance S4, so the positions of the first and second turning points can be determined from the distance S1, the route spacing S2, and the preset distance S4.
The value range of the longitudinal distance S4 is: S4 > 0.
The following describes the determination of the turning route in detail in an exemplary scenario:
As shown in fig. 2B, a coordinate system is established with the center of the robot as the origin, the horizontal direction as the horizontal axis, and the vertical direction as the vertical axis. Assume the robot is a circle with radius r, point A is the first turning point, point B the first intermediate point, point C the second intermediate point, and point D the second turning point, and the distance between the obstacle detected on the first straight route and the robot's current position is S1. To prevent the robot from colliding with the obstacle on the turning route, the distance between the first straight-line trajectory BC and the obstacle must be at least r. Therefore, using the distance S1, the route spacing S2, and the preset distance S3, the position of point B is (0.5(S2-S3), S1-r) and the position of point C is (0.5(S2+S3), S1-r); using the distance S1, the route spacing S2, and the preset distance S4, the position of point A is (0, S1-r-S4) and the position of point D is (S2, S1-r-S4).
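The point positions derived above can be sketched in a few lines of Python; the function name and the concrete parameter values in the usage note are illustrative assumptions, not part of the application.

```python
def turning_waypoints(S1, S2, S3, S4, r):
    """Compute the four turning-route points A, B, C, D in the frame
    of fig. 2B (origin at the robot centre, first straight route on
    x = 0). A sketch of the positions derived in the text."""
    assert 0 <= S3 <= S2 and S4 > 0
    y_mid = S1 - r            # trajectory BC keeps clearance r from the obstacle
    y_turn = S1 - r - S4      # turning points sit S4 below the intermediate points
    A = (0.0, y_turn)                 # first turning point
    B = (0.5 * (S2 - S3), y_mid)      # first intermediate point
    C = (0.5 * (S2 + S3), y_mid)      # second intermediate point
    D = (S2, y_turn)                  # second turning point
    return A, B, C, D
```

For example, with S1 = 1.0, S2 = 0.6, S3 = 0.2, S4 = 0.1, r = 0.15 (all in metres), this gives A = (0, 0.75), B = (0.2, 0.85), C = (0.4, 0.85), D = (0.6, 0.75).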
Thus, the relation of the first straight route is:
x = 0
where 0 ≤ y ≤ S1 - r - S4
The relation of the second straight route is:
x = S2
where 0 ≤ y ≤ S1 - r - S4
The relation of the first straight-line trajectory BC is:
y = S1 - r
where 0.5(S2 - S3) ≤ x ≤ 0.5(S2 + S3)
From the above relations, the derivative at point A on the first straight route is infinite (a vertical tangent), the derivatives at points B and C on the first straight-line trajectory BC are 0, and the derivative at point D on the second straight route is likewise infinite. Since the first arc-shaped trajectory lies between point A and point B, the derivative at point A on the first arc-shaped trajectory is also infinite and the derivative at point B on it is also 0; the first arc-shaped trajectory is therefore a quarter of an ellipse. It follows that the center of the first arc-shaped trajectory is O1 = (0.5(S2-S3), S1-r-S4), and, by the same reasoning, the center of the second arc-shaped trajectory is O2 = (0.5(S2+S3), S1-r-S4).
Further, the relation of the first arc-shaped trajectory is:
(x - 0.5(S2-S3))² / (0.5(S2-S3))² + (y - (S1-r-S4))² / S4² = 1
where 0 ≤ x ≤ 0.5(S2-S3) and S1-r-S4 ≤ y ≤ S1-r
The relation of the second arc-shaped trajectory is:
(x - 0.5(S2+S3))² / (0.5(S2-S3))² + (y - (S1-r-S4))² / S4² = 1
where 0.5(S2+S3) ≤ x ≤ S2 and S1-r-S4 ≤ y ≤ S1-r
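Under the relations above, the whole turning route A→B→C→D can be sampled by parameterizing each quarter ellipse about its center; the following Python sketch (function name and sampling density are illustrative assumptions) uses the centers O1 and O2 and the semi-axes 0.5(S2-S3) and S4 derived in the text.

```python
import math

def sample_turn(S1, S2, S3, S4, r, n=20):
    """Sample the turning route A -> B -> C -> D of fig. 2B.
    Each arc is a quarter ellipse with horizontal semi-axis
    a = 0.5*(S2-S3) and vertical semi-axis S4, so the tangent is
    vertical at A and D and horizontal at B and C."""
    a = 0.5 * (S2 - S3)              # horizontal semi-axis
    cy = S1 - r - S4                 # y of both ellipse centres O1, O2
    c2 = 0.5 * (S2 + S3)             # x of centre O2
    pts = []
    # first arc: centre O1 = (a, cy), swept from A (t = pi) to B (t = pi/2)
    for i in range(n + 1):
        t = math.pi - (math.pi / 2) * i / n
        pts.append((a + a * math.cos(t), cy + S4 * math.sin(t)))
    # straight-line trajectory BC ends at C on y = S1 - r
    pts.append((c2, S1 - r))
    # second arc: centre O2 = (c2, cy), swept from C (t = pi/2) to D (t = 0)
    for i in range(1, n + 1):
        t = (math.pi / 2) * (1 - i / n)
        pts.append((c2 + a * math.cos(t), cy + S4 * math.sin(t)))
    return pts
```

The first sample coincides with A = (0, S1-r-S4) and the last with D = (S2, S1-r-S4), confirming the tangency conditions at the route ends.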
It should be noted that when the length S3 of the first straight-line trajectory between the first and second intermediate points is 0, the two intermediate points merge into one point; as shown in fig. 2C, point B and point C merge into a single point, and since the derivative of that point on the first arc-shaped trajectory equals its derivative on the second arc-shaped trajectory, the two arc-shaped trajectories can also be merged into a single arc-shaped trajectory.
Compared with the turning route shown in fig. 1B, although the present application also has an arc-shaped trajectory between the first turning point and the second turning point when S3 = 0, each turning point here has the same derivative on both trajectories it belongs to, whereas in fig. 1B the derivatives of the turning point on its two trajectories differ.
Therefore, compared with the prior art, even when the two arc-shaped trajectories merge into one, the U-turn along that arc does not require pausing for pivot steering.
In this embodiment, the turning route is composed of the first arc-shaped trajectory, the first straight-line trajectory, and the second arc-shaped trajectory. During the U-turn, since the main direction of the camera's visible range coincides with the robot's traveling direction while the robot travels in a straight line, the traveling direction on the straight-line trajectory between the two arcs coincides with the main direction of the visible range, and only the traveling directions on the two arc-shaped trajectories deviate from it. This guarantees that part of the turning trajectory still falls within the robot's visible area, which in turn reduces the probability of colliding with an object in the blind area. Moreover, because each of the four points on the turning route has the same derivative on both trajectories it belongs to, the robot can pass any of these points without pausing for pivot steering; its motion is smoother and its operating efficiency is high.
Fig. 3A is a flowchart illustrating another embodiment of a method for controlling turning around of a robot according to an exemplary embodiment of the present application, where the present embodiment takes how the robot travels according to a turning route determined in any one of the embodiments of fig. 2A to 2C as an example, and as shown in fig. 3A, the method for controlling turning around of a robot further includes the following steps:
Step 301: the robot travels along the first straight route to the first turning point.
Step 302: the robot travels from the first turning point along the turning route to the second straight route.
The following describes in detail a scenario in which the turning path includes a first arc-shaped trajectory, a first straight-line trajectory, and a second arc-shaped trajectory as an example:
as shown in the path diagram of fig. 3B, the line with an arrow indicates the traveling direction of the robot, a first arc-shaped track is formed between the point a and the point B, a first straight-line track is formed between the point B and the point C, and a second arc-shaped track is formed between the point C and the point D.
When the robot travels to point A on the first straight route, it travels from point A to point B along the first arc-shaped track, from point B to point C along the first straight-line track, and from point C to point D along the second arc-shaped track, then continues onto the second straight route, and so on. Since each of points A, B, C, and D has the same derivative on the two tracks it belongs to, the robot transitions smoothly from each point onto the next track without pausing for pivot steering.
In the turning process of the robot, since the main direction of the camera's visible range coincides with the robot's traveling direction while the robot travels in a straight line, the traveling direction on the straight-line track between the two arc-shaped tracks coincides with the main direction of the visible range, and only the traveling directions on the two arc-shaped tracks deviate from it; part of the turning track is thus guaranteed to fall within the robot's visible area, which reduces the probability of collision with an object in the blind area. Moreover, because each of the four points on the turning route has the same derivative on both tracks it belongs to, the robot can pass any of these points without pausing for pivot steering; its motion is smoother and its operating efficiency is high.
It is noted that the robot travels along the first straight route and then along the turning route upon reaching the first turning point only when it does not detect, during travel, an obstacle affecting the turning route.
In one embodiment, during travel the robot obtains the obstacles around it and expands (inflates) the contour of each obstacle by a preset size to obtain a target contour. If the target contour intersects the robot's driving route at two points, the driving route is updated according to the two intersection points so that the robot avoids colliding with the obstacle while traveling.
The preset size can be set according to actual requirements; for example, when the robot is circular, the preset size can be set to the robot's radius. If the target contour and the driving route have two intersection points, the robot is likely to collide with the obstacle if it follows the current route, so the driving route must be updated according to the two intersection points.
For example, while the robot is driving, the obstacle sensor continuously detects surrounding obstacles and marks them in the environment map, so the robot can obtain the obstacles around it from the constructed map. If an obstacle is large, the obstacle sensor detects different positions of it from different directions, and the edge of the area formed by these positions is the contour of the obstacle.
The two intersection points of the target contour and the driving route cover three cases: first, both intersection points lie on the first or the second straight route; second, both lie on the turning route; third, one lies on the turning route and the other on the first or second straight route.
In an embodiment, to update the driving route according to the two intersection points, the portion of the route between them may be replaced by the arc of the target contour bounded by the two intersection points.
As shown in fig. 3C, after the contour of an obstacle near the robot is expanded, the contour intersects the robot's turning route A-B-C-D at points M and N; the portion of the driving route between point M and point N can then be replaced by the arc of the target contour between M and N.
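A minimal Python sketch of this replacement step, assuming for simplicity a circular obstacle and a horizontal route segment (the application itself allows arbitrary contours; all names here are illustrative):

```python
import math

def detour_arc(seg_y, x0, x1, obs_center, obs_radius, robot_r, n=12):
    """If the horizontal route segment y = seg_y, x0 <= x <= x1 cuts the
    inflated (target) contour of a circular obstacle at two points M and
    N, return the contour arc M..N that replaces the blocked portion,
    bulging away from the obstacle centre; otherwise return None."""
    cx, cy = obs_center
    R = obs_radius + robot_r            # inflate contour by robot radius
    dy = seg_y - cy
    if abs(dy) >= R:
        return None                     # segment misses the contour
    dx = math.sqrt(R * R - dy * dy)
    if not (x0 < cx - dx and cx + dx < x1):
        return None                     # need both intersections inside
    tm = math.atan2(dy, -dx)            # angle of M (left intersection)
    tn = math.atan2(dy, dx)             # angle of N (right intersection)
    # interpolating tm -> tn sweeps the half of the contour on the far
    # side of the segment from the obstacle centre
    return [(cx + R * math.cos(tm + (tn - tm) * i / n),
             cy + R * math.sin(tm + (tn - tm) * i / n))
            for i in range(n + 1)]
```

The returned polyline starts and ends exactly on the original segment (at M and N) and detours around the inflated contour in between.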
It should be noted that, to further smooth the driving route and increase the stability and efficiency of the robot's motion, a trajectory-smoothing method may be applied to the updated route after it is generated, i.e., the uneven inflection points are smoothed out, so that the robot moves more stably, efficiently, and smoothly.
Since the robot's minimum turning radius can be zero, its kinematic constraints need not be considered during trajectory smoothing. An uneven inflection point is a point on the driving route where the trajectory is discontinuous and the derivative does not exist; in the present application, the intersection points with the driving route are uneven inflection points.
Those skilled in the art will appreciate that the trajectory smoothing method may be implemented by related technologies, and the trajectory smoothing method is not particularly limited in the present application.
For example, as shown in fig. 3C, the U-turn trajectory becomes the first arc-shaped trajectory between point A and point M, the expanded obstacle-contour trajectory between point M and point N, and the second arc-shaped trajectory between point N and point D. The trajectory is discontinuous at points M and N, and the angle between the trajectories before and after each of these points is relatively large, so the robot would have to slow almost to a stop at M and N to turn, greatly reducing its operating efficiency. A trajectory-smoothing method is therefore applied: as shown in fig. 3D, after smoothing, the U-turn trajectory A-M-N-D becomes A-M'-N'-D.
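The application does not mandate a particular smoothing method; as one common stand-in, Chaikin corner cutting rounds off uneven inflection points such as M and N. A minimal Python sketch (the function name and iteration count are illustrative assumptions):

```python
def chaikin_smooth(points, iterations=2):
    """Chaikin corner cutting: each pass replaces every segment with two
    points at 1/4 and 3/4 of its length, keeping the endpoints, so sharp
    corners between segments are progressively rounded off."""
    pts = list(points)
    for _ in range(iterations):
        out = [pts[0]]                       # keep the start point
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            out.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            out.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        out.append(pts[-1])                  # keep the end point
        pts = out
    return pts
```

Applied to a polyline with a right-angle corner, one pass preserves the endpoints while replacing the corner with two nearby points, eliminating the derivative discontinuity at the corner.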
This completes the flow shown in fig. 3A. Through this flow, the robot does not need to pause and pivot in place at any point during the U-turn, so its operating efficiency is high.
Fig. 4A is a flowchart of another robot U-turn control method according to an exemplary embodiment of the present application. Based on the embodiments shown in figs. 2A to 3A, the present embodiment takes how to optimize the turning route when the robot detects an obstacle affecting it during travel as an example. In this embodiment, the turning route includes a first arc-shaped trajectory between the first turning point and the first intermediate point, a first straight-line trajectory between the first intermediate point and the second intermediate point, and a second arc-shaped trajectory between the second intermediate point and the second turning point.
As shown in fig. 4A, the robot turning-around control method further includes the steps of:
Step 401: if the first turning point, the first intermediate point, the second intermediate point, and the second turning point are within the visible range of the camera arranged on the robot, determine the visible area of the robot according to the visible range of the camera and the positions of the obstacles contained in that range.
While traveling along the first straight route, the robot judges in real time whether the first turning point, the first intermediate point, the second intermediate point, and the second turning point are included in the camera's visible range; if so, it determines its visible area. The robot can obtain the obstacles contained in the visible range from the constructed environment map.
As shown in fig. 4B, when an obstacle lies within the camera's visible range, the region behind the obstacle is occluded and becomes invisible to the robot; the unoccluded part of the visible range is the robot's current visible area.
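As an illustrative sketch of such a visibility test, the following Python function checks whether a point lies within the camera's field-of-view cone and is not shadowed by a circular obstacle on the sight line; the function name, the circular-obstacle simplification, and the angle convention are assumptions for illustration, not the application's prescribed method.

```python
import math

def point_visible(robot, heading, half_fov, max_range, point, obstacles):
    """Return True if `point` lies in the robot's current visible area:
    within `max_range` of the camera, inside the field-of-view cone of
    half-angle `half_fov` (radians) around `heading`, and not occluded
    by any circular obstacle (x, y, radius) on the sight line."""
    px, py = point[0] - robot[0], point[1] - robot[1]
    dist = math.hypot(px, py)
    if dist == 0.0:
        return True
    if dist > max_range:
        return False
    # angular offset from the camera's main direction, wrapped to [-pi, pi]
    dang = (math.atan2(py, px) - heading + math.pi) % (2 * math.pi) - math.pi
    if abs(dang) > half_fov:
        return False
    for ox, oy, orad in obstacles:
        # project the obstacle centre onto the sight line robot -> point
        t = ((ox - robot[0]) * px + (oy - robot[1]) * py) / (dist * dist)
        if 0.0 < t < 1.0:
            gap = math.hypot(ox - (robot[0] + t * px),
                             oy - (robot[1] + t * py))
            if gap < orad:
                return False            # obstacle shadows the point
    return True
```

Running this test for each of the four turning-route points against the mapped obstacles yields the "point outside the visible area" condition used in step 402.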
Step 402: judge whether any of the first turning point, the first intermediate point, the second intermediate point, or the second turning point lies outside the visible area; if so, execute step 403, otherwise end the current flow.
Step 403: optimize the position of at least one of the first turning point, the first intermediate point, the second intermediate point, and the second turning point, so that any point outside the visible area falls within the robot's visible area before the robot reaches it.
In an embodiment, a target point may be selected from the first turning point, the first intermediate point, the second intermediate point, and the second turning point for position optimization, based on which point lies outside the visible area.
The optimization process of the turning route is described in four cases as follows:
In the first case: the first turning point is outside the visible area.
The first turning point is selected as the target point, and the optimization may be: update the position of the first turning point by increasing the distance between it and the robot's current position, where the added distance is less than the preset distance S4.
As shown in fig. 4C, the visible area of the robot is the shaded portion. When point A (i.e., the first turning point) lies behind the obstacle, it is never seen while the robot travels on the first straight route. Point A can then be moved upward by increasing the distance between point A and the robot's current position, so that before reaching point A the robot bypasses the obstacle and point A falls within its visible area. As shown in fig. 4C, point A is moved upward to point A', the intersection of the first straight route and the first straight-line trajectory. Of course, the upward distance may also be determined from the visible area newly gained while the robot moves around the obstacle.
In the second case: the first intermediate point is outside the visible area.
The first way: select the first turning point as the target point, i.e., update the position of the first turning point by reducing the distance between it and the robot's current position.
The second way: select the first intermediate point as the target point, i.e., update the position of the first intermediate point by increasing or decreasing the length S3 of the first straight-line trajectory.
As shown in fig. 4D, in the first way, point A is moved downward to reduce the distance between it and the robot's current position, so that point B falls into the visible area as early as possible before the robot reaches it; the downward distance can be preset from practical experience. In the second way, point B is moved left or right out of the blind area so that it falls within the visible area as far as possible; in fig. 4D, point B is moved left to point B', which lies within the visible area.
In the third case: the second intermediate point is outside the visible area.
The first way: select the first intermediate point as the target point, i.e., update the position of the first intermediate point by increasing the length S3 of the first straight-line trajectory;
The second way: select the second intermediate point as the target point, i.e., update the position of the second intermediate point by increasing or decreasing the length S3 of the first straight-line trajectory.
As shown in fig. 4E, the first way moves point B leftward to point B', so that point C falls into the visible area as early as possible, i.e., as soon as the robot turns onto the straight-line trajectory BC; the second way moves point C leftward out of the blind area so that it falls within the visible area as far as possible.
In the fourth case: the second turning point is outside the visible area.
The first way: select the first intermediate point as the target point, i.e., update the position of the first intermediate point by increasing the length S3 of the first straight-line trajectory;
The second way: select the second intermediate point as the target point, i.e., update the position of the second intermediate point by reducing the length of the first straight-line trajectory;
The third way: select the second turning point as the target point, i.e., update the position of the second turning point by increasing the longitudinal distance between the second turning point and the robot's current position, where the added distance is less than the preset distance S4.
As shown in fig. 4F, the first way moves point B leftward so that point D falls into the visible area as early as possible before the robot reaches it; the second way moves point C leftward, e.g., to point C' in fig. 4F, so that point D can fall into the visible area as early as possible; the third way moves point D upward out of the blind area.
It should be noted that the above four cases are only examples; the present application does not exclude other optimization methods. When the four cases occur in combination, the four points can be optimized selectively according to the above optimization principles.
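The case-by-case adjustments above can be framed generically as an iterate-until-visible loop; the following Python sketch is only a schematic of that idea, under the assumption that the visibility predicate and the per-point moves (e.g. shift A up, shift B left) are supplied by the caller. Nothing in it, including the names, is prescribed by the application.

```python
def optimize_waypoints(pts, visible, nudges, max_iter=20):
    """Repeatedly apply a small case-specific nudge to any turning-route
    point (keyed, e.g., 'A', 'B', 'C', 'D') that lies outside the
    visible area, until all points are visible or iterations run out.

    pts     -- dict name -> (x, y)
    visible -- predicate on a point, True if in the visible area
    nudges  -- dict name -> function moving that point one small step
    """
    pts = dict(pts)                       # do not mutate the caller's dict
    for _ in range(max_iter):
        hidden = [k for k, p in pts.items() if not visible(p)]
        if not hidden:
            break                         # every point is visible now
        for k in hidden:
            pts[k] = nudges[k](pts[k])
    return pts
```

After the loop, any point that moved should be followed by the trajectory smoothing described next, since its derivatives on the adjoining trajectories may no longer match.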
In the above optimization process, once the position of any of the four points on the turning route changes, its derivatives on the two trajectories it belongs to may no longer be equal, making it an inflection point; the robot would then get stuck when moving through it, reducing operating efficiency. Therefore, after the optimization update, the uneven inflection points can be smoothed with a trajectory-smoothing method.
This completes the flow shown in FIG. 4A. Through this flow, obstacle information can be fused into the optimization of the turning route: the visible area of the robot is obtained from the position of the obstacle and the visible range of the camera, and if a turning point is blocked by the obstacle and does not fall within the visible area, position optimization can be performed on any of the four points of the turning route, so as to reduce the possibility of collision with the obstacle as much as possible.
FIG. 5 is a structural diagram of an embodiment of a robot U-turn control device according to an exemplary embodiment of the present application. The robot U-turn control device includes:
a detection module 510, configured to detect an obstacle on a first straight travel route;
a processor 520, configured to: determine a distance S1 between the current position and the obstacle when the obstacle is detected on the first straight travel route; and determine, according to the distance S1 and the route spacing S2 between the first straight travel route and a second straight travel route, a turning route for turning from the first straight travel route to the second straight travel route; where the turning route connects a first turning point on the first straight travel route and a second turning point on the second straight travel route, the derivative of the turning route at the first turning point is the same as that of the first straight travel route, and the derivative of the turning route at the second turning point is the same as that of the second straight travel route.
In an alternative implementation, the turning route includes: a first curved trajectory between the first turning point and a first intermediate point, a first linear trajectory between the first intermediate point and a second intermediate point, and a second curved trajectory between the second intermediate point and the second turning point; the derivative of the first turning point on the first straight travel route is the same as its derivative on the first curved trajectory, the derivative of the first intermediate point on the first curved trajectory is the same as its derivative on the first linear trajectory, the derivative of the second intermediate point on the first linear trajectory is the same as its derivative on the second curved trajectory, and the derivative of the second turning point on the second straight travel route is the same as its derivative on the second curved trajectory.
In an alternative implementation, the distance between the first intermediate point and the second intermediate point is a preset distance S3; the positions of the first intermediate point and the second intermediate point are determined according to the distance S1, the route spacing S2, and the preset distance S3; the center point between the first intermediate point and the second intermediate point is located on the center line between the first straight travel route and the second straight travel route, and the first linear trajectory is perpendicular to both straight travel routes.
In an alternative implementation, the longitudinal distance between the first turning point and the first intermediate point and the longitudinal distance between the second turning point and the second intermediate point are both a preset distance S4; the positions of the first turning point and the second turning point are determined according to the distance S1, the route spacing S2, and the preset distance S4.
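For illustration only, the geometry described above can be sketched under assumptions the source leaves implicit: the first route lies on y = 0 heading in +x, the second route on y = S2, the robot is at the origin, and the first turning point is placed before the obstacle with a hypothetical safety margin. All names are illustrative.

```python
# Sketch of the four points of the turning route: first turning point A,
# intermediate points B and C, and second turning point D, all as (x, y).

def turning_route_points(s1, s2, s3, s4, margin=0.2):
    """Return (A, B, C, D) for distance-to-obstacle s1, route spacing s2,
    linear-trajectory length s3, and longitudinal offset s4."""
    ax = s1 - s4 - margin  # hypothetical placement: turn begins before the obstacle
    a = (ax, 0.0)          # first turning point, on the first route
    # B->C is perpendicular to both routes, has preset length S3, and its
    # center lies on the center line y = S2 / 2.
    b = (ax + s4, (s2 - s3) / 2)  # first intermediate point
    c = (ax + s4, (s2 + s3) / 2)  # second intermediate point
    d = (ax, s2)                  # second turning point, longitudinal offset S4 from C
    return a, b, c, d
```

One can check that B→C has length S3, its midpoint lies on the center line, and the longitudinal (x) offsets A→B and D→C are both S4, as the implementations above require.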
FIG. 6 is a structural diagram of another embodiment of a robot U-turn control device according to an exemplary embodiment of the present application. Based on the device structure shown in FIG. 5, the robot U-turn control device further includes:
a power driving module 530, configured to control the robot to travel along the first straight travel route, and to control the robot to travel along the turning route, so as to turn from the first straight travel route to the second straight travel route, when the robot reaches the first turning point on the turning route determined by the device of FIG. 5.
In an alternative implementation, traveling along the first straight travel route, and traveling along the turning route upon reaching the first turning point, are performed when the robot does not detect an obstacle affecting the turning route during travel.
In an alternative implementation, the turning route includes: a first curved trajectory between the first turning point and a first intermediate point, a first linear trajectory between the first intermediate point and a second intermediate point, and a second curved trajectory between the second intermediate point and the second turning point; the device further includes (not shown in FIG. 6):
a route optimization module, configured to: when the robot detects, during travel, an obstacle affecting the turning route, determine a visible area of the robot according to the visible range of a camera provided on the robot and the position of the obstacle contained in that visible range; judge whether any of the first turning point, the first intermediate point, the second intermediate point, and the second turning point lies outside the visible area; and if so, perform position optimization on at least one of the four points, so that the point outside the visible area falls into the visible area of the robot before the robot arrives at it.
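As a simplified, non-authoritative sketch of this visibility judgment, assuming a circular obstacle, a planar camera cone at the robot's position, and ignoring the camera's mounting details described elsewhere in the specification:

```python
import math

def point_visible(robot, heading, fov, max_range, obstacle, radius, point):
    """True if `point` lies inside the camera cone (field of view `fov`,
    visible distance `max_range`) and is not shadowed by the obstacle disc."""
    dx, dy = point[0] - robot[0], point[1] - robot[1]
    dist = math.hypot(dx, dy)
    if dist > max_range:
        return False  # beyond the visible distance
    # Wrap the bearing difference into (-pi, pi] before comparing to fov/2.
    angle = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi)
    if angle > fov / 2:
        return False  # outside the field of view
    # Shadow test: does the ray robot->point pass through the obstacle disc?
    ox, oy = obstacle[0] - robot[0], obstacle[1] - robot[1]
    t = (ox * dx + oy * dy) / (dist * dist)  # ray parameter of closest approach
    if 0 < t < 1 and math.hypot(ox - t * dx, oy - t * dy) < radius:
        return False  # blocked by the obstacle
    return True
```

A point of the turning route that fails this test would be the trigger for the position optimization described above.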
In an alternative implementation, the apparatus further comprises (not shown in fig. 6):
an obstacle avoidance module, configured to acquire obstacles around the robot during travel of the robot, and to expand the contour of an obstacle by a preset size to obtain a target contour; and, if the target contour has two intersection points with the driving route of the robot, to update the driving route of the robot according to the two intersection points.
In an optional implementation, the obstacle avoidance module is specifically configured to, in updating the driving route of the robot according to the two intersection points, replace the portion of the driving route between the two intersection points with the arc segment of the target contour bounded by the two intersection points.
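The contour expansion and arc replacement can be illustrated, for the simplest case of a circular obstacle contour and a straight driving route, by the following hypothetical sketch (the source does not restrict the contour shape, and all names are illustrative):

```python
import math

def detour_arc(center, radius, inflation, y_route, n=8):
    """Expand a circular contour by `inflation` (the preset size), intersect
    the route y = y_route with the target contour, and return sampled points
    of the arc on the target contour that replaces the route between the two
    intersection points."""
    r = radius + inflation  # target contour after expansion
    cx, cy = center
    h = y_route - cy
    if abs(h) >= r:
        return []  # fewer than two intersection points: no update needed
    half = math.sqrt(r * r - h * h)
    a0 = math.atan2(h, -half)  # angle of the first intersection point
    a1 = math.atan2(h, half)   # angle of the second intersection point
    # Sample the arc sweeping from a0 to a1 over the far side of the chord,
    # so the detour stays on the target contour, outside the obstacle.
    return [(cx + r * math.cos(a0 + (a1 - a0) * k / n),
             cy + r * math.sin(a0 + (a1 - a0) * k / n)) for k in range(n + 1)]
```

For a route through the center of the contour, the replacement is a semicircle of the inflated radius whose endpoints are the two intersection points.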
The implementation processes of the functions and actions of the units in the above device are described in detail in the implementation processes of the corresponding steps in the above method, and are not repeated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the relevant parts of the description of the method embodiments. The device embodiments described above are merely illustrative: the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement the scheme without inventive effort.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.