Localization and Control of the Quadrotor Using IMU and Mono Camera (International RoboCup Iran Open 2011)


Localization and Control of the Quadrotor Using IMU and a Mono Camera

Vahid Khorani, Alireza Mohammad Shahri, Arash Shirkhorshidi, Seyyed Mohammad Reza Mirfakhar

Abstract—This paper proposes a method for using a vision-aided inertial navigation system (V-INS) on a quadrotor flying robot. The proposed method uses a light mono camera to detect fixed land-marks, calculates their distances from the robot, and uses this information to compute the position of the robot. These observations are then fused with a low-cost inertial measurement unit (IMU) using a Kalman filter (K-F) to obtain more precise and faster information about the robot position.

I. INTRODUCTION

Recently, inertial navigation systems (INSs) have been used for estimating the position of vehicles such as airplanes, helicopters, and automobiles [1]. At the core of most INSs lies an inertial measurement unit (IMU) that measures linear accelerations and rotational velocities. By integrating these signals in real time, an INS is capable of tracking the position, velocity, and attitude of a vehicle. This dead-reckoning process, however, cannot be used over extended periods of time, because the errors in the computed estimates continuously increase due to the noise and biases present in the inertial measurements. For this reason, current INSs rely on an absolute sensor such as GPS or a camera in order to receive periodic corrections. In most cases, a Kalman filter (K-F) estimator is used for optimally combining the IMU and GPS measurements [2]. Nevertheless, during GPS signal outages and in indoor use, a standalone IMU, especially a low-cost one, suffers from one serious limitation: errors in the estimated navigation states rapidly accumulate with time due to the highly nonlinear drift of the inertial sensors' measurements [3].

Using a stereo camera to calculate depth from the image and estimate the position of the robot is another method for navigating a robot indoors [4]; in this case, vision techniques can be used to calculate the observations. High price, heavy camera sets, and the need for powerful processors to process the images are some of the difficulties that limit the use of this kind of camera on flying robots. For a quadrotor, which is a light flying robot, there are limitations on the equipment it can carry because of weight. Recently, GPS-aided INSs have been developed for navigation on quadrotors [5]. For indoor quadrotor flight, light and reliable techniques are needed that use low-power processors.

In this paper a new method is developed that uses a mono-camera-aided INS for navigating a quadrotor flying robot. The proposed method uses a land-mark at a fixed place to estimate the position of the robot during indoor flights.

Vahid Khorani is with the Mechatronic Research Laboratory (MRL), Islamic Azad University, Qazvin Branch, Iran (e-mail: [email protected]). Alireza Mohammad Shahri is an assistant professor at the Electrical Engineering Department of Iran University of Science and Technology (IUST), Tehran, Iran (e-mail: [email protected]). Arash Shirkhorshidi is with the Mechatronic Research Laboratory (MRL), Islamic Azad University, Qazvin Branch, Iran (e-mail: [email protected]). Seyyed Mohammad Reza Mirfakhar is with the Mechatronic Research Laboratory (MRL), Islamic Azad University, Qazvin Branch, Iran (e-mail: [email protected]).
A K-F is used to fuse the IMU and camera observations. The method uses a light mono camera, a light low-power processor, and a low-cost MEMS IMU, which are suitable choices for a light quadrotor.

II. USING IMU TO CALCULATE POSITION

In inertial navigation for a flying robot, the Euler angles are calculated using a three-dimensional gyroscope set. The calculated angles are corrected using a three-dimensional accelerometer and a three-dimensional compass sensor set. After that, the angles are used to obtain the rotation matrix. Finally, it is possible to calculate the position of the robot using the rotation matrix and the three-dimensional accelerometer set. The position calculated in this section is later corrected using the position obtained from the camera.

Two frames are used. The first frame, named the 'body-frame', is rigidly connected to the center of gravity of the quadrotor; the second frame, named the 'earth-frame', is fixed to the earth. Fig. 1 shows the earth-frame and body-frame used in inertial navigation.

[Fig. 1: earth-frame and body-frame]

The Euler angles are calculated using equation (1) [6]:

$$\begin{bmatrix}\dot{\phi}\\ \dot{\theta}\\ \dot{\psi}\end{bmatrix} = \begin{bmatrix}1 & \sin\phi\tan\theta & \cos\phi\tan\theta\\ 0 & \cos\phi & -\sin\phi\\ 0 & \sin\phi/\cos\theta & \cos\phi/\cos\theta\end{bmatrix} \begin{bmatrix}p\\ q\\ r\end{bmatrix} \quad (1)$$

where $\phi$, $\theta$ and $\psi$ are the ZYX Euler angles and $p$, $q$ and $r$ are the angular rates measured using the three gyroscopes. The rotation matrix is obtained from the Euler angles [6]:

$$R = \begin{bmatrix} c\theta\, c\psi & s\phi\, s\theta\, c\psi - c\phi\, s\psi & c\phi\, s\theta\, c\psi + s\phi\, s\psi\\ c\theta\, s\psi & s\phi\, s\theta\, s\psi + c\phi\, c\psi & c\phi\, s\theta\, s\psi - s\phi\, c\psi\\ -s\theta & s\phi\, c\theta & c\phi\, c\theta \end{bmatrix} \quad (2)$$

where $R$ is the rotation matrix used to transform the measured accelerations from the body-frame to the earth-frame ($c$ and $s$ abbreviate cosine and sine). The accelerations measured using the accelerometers are transferred to the earth-frame:

$$\begin{bmatrix}a_x & a_y & a_z\end{bmatrix}^T = R \begin{bmatrix}a_x^b & a_y^b & a_z^b\end{bmatrix}^T \quad (3)$$

where $a_x^b$, $a_y^b$ and $a_z^b$ are the accelerations measured using the three accelerometers and $a_x$, $a_y$ and $a_z$ are the robot accelerations in the earth-frame. Integrating the accelerations of the robot in the earth-frame yields the robot position. This position diverges over time because of odometric errors; to remove these errors, an absolute position measurement is needed to correct the estimates over time. In this paper, a mono camera is used to obtain the robot position and correct the inertial navigation results.
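As a concrete illustration of equations (1)-(3), the following is a minimal sketch of one dead-reckoning step in Python with NumPy. It is not from the paper: the function names, the state layout, and the z-up gravity-removal convention are assumptions.

```python
import numpy as np

def euler_rates(phi, theta, p, q, r):
    # Equation (1): ZYX Euler-angle kinematics driven by gyro rates p, q, r.
    phi_dot = p + np.tan(theta) * (np.sin(phi) * q + np.cos(phi) * r)
    theta_dot = np.cos(phi) * q - np.sin(phi) * r
    psi_dot = (np.sin(phi) * q + np.cos(phi) * r) / np.cos(theta)
    return np.array([phi_dot, theta_dot, psi_dot])

def body_to_earth(phi, theta, psi):
    # Equation (2): rotation matrix from body-frame to earth-frame.
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(psi), np.sin(psi)
    return np.array([
        [ct * cp, sf * st * cp - cf * sp, cf * st * cp + sf * sp],
        [ct * sp, sf * st * sp + cf * cp, cf * st * sp - sf * cp],
        [-st,     sf * ct,               cf * ct],
    ])

def ins_step(angles, vel, pos, gyro, accel, dt, g=9.81):
    # One dead-reckoning step: integrate the attitude, rotate the measured
    # acceleration into the earth-frame (equation (3)), remove gravity
    # (assuming a z-up earth-frame and an accelerometer that measures
    # specific force), then integrate twice for velocity and position.
    angles = angles + euler_rates(angles[0], angles[1], *gyro) * dt
    a_earth = body_to_earth(*angles) @ accel - np.array([0.0, 0.0, g])
    vel = vel + a_earth * dt
    pos = pos + vel * dt
    return angles, vel, pos
```

In practice the integrated attitude would also be corrected by the accelerometer and compass before the rotation is applied, as described above.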
III. CAMERA AND VISION TECHNIQUES

In this project a UEye mono camera with a TV lens is used to detect color land-marks.

A. Color Space Conversions

The first task in any image-processing application is converting between color spaces. The default color space of most cameras is RGB, which is extremely fast to compute, since conversions to and from it require minimal system resources. RGB has a problem, however: its discrete distribution of colors can cause many difficulties when processing colored images. The solution is the HSL color space, which rearranges the geometry of RGB in an attempt to be more perceptually relevant than the Cartesian representation, so colors are classified in a simpler way than in RGB.

B. Detecting Color Land-marks

The basic level of this project is built on vision technology and its ability to find objects and land-marks. A mono camera is used to find color land-marks in order to calculate the distance of the robot from the detected land-mark; this is done by detecting blobs in the captured image.

C. Estimating Object Position

After detecting the object, it must be tracked and its position in the image frame maintained. First, all moments up to third order of the polygon shape are calculated [7]. The orders $p$ and $q$ of the retrieved moments satisfy

$$p \ge 0, \quad q \ge 0, \quad p + q \le 3 \quad (4)$$

and the spatial moments are

$$m_{pq} = \sum_{x}\sum_{y} I(x,y)\, x^{p}\, y^{q} \quad (5)$$

where $m_{pq}$ is the spatial moment and $I(x,y)$ is the intensity of the pixel $(x,y)$. Then the central moments are calculated:

$$\mu_{pq} = \sum_{x}\sum_{y} I(x,y)\,(x - x_c)^{p}\,(y - y_c)^{q} \quad (6)$$

where $\mu_{pq}$ is the central moment and $x_c$ and $y_c$ are the coordinates of the gravity center of the shape:

$$x_c = \frac{m_{10}}{m_{00}}, \qquad y_c = \frac{m_{01}}{m_{00}} \quad (7)$$

The position of the object in the image can then be obtained:

$$X = \frac{m_{10}}{\mu_{00}}, \qquad Y = \frac{m_{01}}{\mu_{00}} \quad (8)$$

where $X$ and $Y$ show the position of the object in the image. The moments may also be used to calculate the shape's area, main axis, and various other characteristics. In addition, the object border positions must be calculated; the positions needed are the maximum right-top X, the maximum right-top Y, the minimum left-down X, and the minimum left-down Y. With these positions, the accurate distance between the camera and the land-mark can be calculated using the equations introduced in the following parts.

IV. USING CAMERA TO CALCULATE POSITION

In this project the land-marks are circles with a diameter of 30 cm. At first, the distance between the camera and the land-mark is calculated when the land-mark is assumed to be in the center of the image and orthogonal to the camera; then the more complicated situation, where the land-mark is not in the center of the image, is studied.

A. Calculating the Distance between Camera and Land-Mark when the Land-Mark is in the Center of the Image and Orthogonal to the Camera

When the land-mark is exactly in the center of the image and orthogonal to the camera (as seen in Fig. 2), a circle with a known diameter is seen, and its apparent size is used to calculate its distance from the center of the camera. The distance between the center of the camera and the land-mark is obtained from equation (9), which relates the real radius of the land-mark, its radius measured in pixels, the sizes $X_s$ and $Y_s$ of the image sides in pixels, and the field-of-view angles of the camera (the camera specifications).

[Fig. 2: land-mark in the center of the image]

B. Calculating the Distance between Camera and Land-Mark when it is not in the Center of the Image

A land-mark that is not in the center of the image (Fig. 3) can be assumed to be a land-mark in the center of a camera that has an angle $\theta'$ in the direction of yaw. The angle $\theta'$ is obtained from the horizontal pixel offset of the land-mark and the angular resolution of the camera (10). If the camera itself has had a yaw angle, in the new situation it is assumed to have the combined angle $\theta''$ (11). In this situation the land-mark is not orthogonal to the camera, and it is necessary to calculate its real diameter: the relation between the diameter of the real land-mark and the apparent one is expressed by equation (12). Finally, the distance between the land-mark and the camera is obtained using equation (13).

[Fig. 3: off-center land-mark modeled as a yawed camera]

As seen in Fig. 4, the red land-mark is seen in the camera and it is possible to measure its diameter. As seen in Fig. 5, using the distance between the center of the camera and the land-mark, the distance between the center of the camera and the center of the image is calculated. In Fig. 5 the position of the land-mark in the image-frame is denoted by $\xi_i'$ and $\eta_i'$, and the distance between the center of the camera and the center of the image-frame is denoted by $l$; $\xi_i'$, $\eta_i'$ and $l$ are calculated using equation (14).

[Fig. 4 and Fig. 5: measuring the land-mark diameter and its position in the image-frame]
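To make Sections III-C and IV-A concrete, here is a minimal Python sketch of the moment, centroid and distance computations. It assumes a binary blob mask and a simple pinhole camera; the function names, the mask representation, and the pinhole form of the distance relation are illustrative assumptions, not the authors' exact equation (9).

```python
import numpy as np

def spatial_moment(mask, p, q):
    # Equation (5) for a binary mask: m_pq = sum over blob pixels of x^p y^q.
    ys, xs = np.nonzero(mask)
    return float(np.sum((xs ** p) * (ys ** q)))

def centroid(mask):
    # Equation (7): gravity center x_c = m10/m00, y_c = m01/m00.
    m00 = spatial_moment(mask, 0, 0)
    return spatial_moment(mask, 1, 0) / m00, spatial_moment(mask, 0, 1) / m00

def central_moment(mask, p, q):
    # Equation (6): mu_pq = sum over blob pixels of (x - x_c)^p (y - y_c)^q.
    xc, yc = centroid(mask)
    ys, xs = np.nonzero(mask)
    return float(np.sum(((xs - xc) ** p) * ((ys - yc) ** q)))

def landmark_distance(radius_px, image_width_px, fov_x_rad, radius_m=0.15):
    # Pinhole relation for a centered, fronto-parallel landmark: the focal
    # length in pixels follows from the horizontal field of view, and
    # similar triangles give distance = real_radius * f_px / radius_px.
    # The 0.15 m default is half of the 30 cm landmark diameter above.
    f_px = (image_width_px / 2.0) / np.tan(fov_x_rad / 2.0)
    return radius_m * f_px / radius_px
```

For the off-center case of Section IV-B, the pixel offset of the blob centroid from the image center would first be converted to a yaw angle through the same focal length before the distance relation is applied.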
V. USING KALMAN FILTER TO ESTIMATE IMU ERRORS

Three frames, the image-frame, the camera-frame, and the earth-frame (shown in Fig. 6), are studied in this section to calculate the position of the camera in the earth-frame. The frame made up of the axes $x_c$, $y_c$ and $z_c$ is named the camera-frame and is located in the center of the camera.

[Fig. 6: image-frame, camera-frame and earth-frame]

The homogeneous transform matrix between the image-frame and the camera-frame is

$$T_i^c = \begin{bmatrix} R_i^c & p_i^c \\ 0 & 1 \end{bmatrix} \quad (15)$$

where $R_i^c$ is a rotation matrix and $p_i^c$ is the position of the center of the image-frame in the camera-frame. Using equation (15), the position of the land-mark is calculated in the camera-frame:

$$P^c = T_i^c \begin{bmatrix} \xi_i' & \eta_i' & 0 & 1 \end{bmatrix}^T \quad (16)$$

The rotation matrix $R_c^e$ between the camera-frame and the earth-frame is obtained using equation (2) (17). In this project it is assumed that, before any motion of the robot, the camera-frame is located at the center of the earth-frame and its orientation is the same as that of the earth-frame. So the position of the land-mark before the robot's motions is calculated as

$$P_{lm}^e = R_c^e(t_0)\, P^c(t_0) \quad (18)$$

and finally the position of the camera during the motions is obtained using

$$P_{cam}^e(t) = P_{lm}^e - R_c^e(t)\, P^c(t) \quad (19)$$

By assuming that the camera-frame and the body-frame of the quadrotor are the same, $P_{cam}^e$ gives the position of the quadrotor in the earth-frame.

Two kinds of sensors are used in this project to complete the instrumentation using sensor-fusion techniques. The inertial navigation system is fast compared to the vision part, but it has odometric errors that diverge over time; the vision technique is not fast enough, but it is an absolute sensor and has no odometric errors. Other absolute sensors used in this project are compass sensors and tilt sensors, which use the IMU accelerometers to obtain Euler angles [8, 9]. The differences between the positions, velocities and attitudes measured using the absolute sensors and those computed by the inertial navigation system form a measurement error vector, so the absolute sensors are used as observations to estimate the inertial navigation system's errors using a Kalman filter.

The inertial navigation system error model has 15 states [10]:

$$x = \begin{bmatrix} \delta p & \delta v & \delta \psi & \varepsilon_g & \varepsilon_a \end{bmatrix}^T \quad (20)$$

where $x$ consists of the position, velocity and attitude errors and the 3-axis gyro and accelerometer drifts. The error process model is [10]

$$\dot{x}(t) = F(t)\,x(t) + w(t) \quad (21)$$

where $w(t)$ is the system noise and $F(t)$ is the dynamic matrix. The discrete state equation is

$$x_k = \Phi_{k,k-1}\, x_{k-1} + w_k \quad (22)$$

where $x_k$ is the state parameter vector, $\Phi_{k,k-1}$ is the state transition matrix and $w_k$ is the process model noise vector. The measurement error vector is related to the error model states by

$$z_k = H_k\, x_k + v_k \quad (23)$$

where $z_k$ is the measurement error vector obtained using the observations, $H_k$ is the observation matrix and $v_k$ is the noise of the measurement model. The Kalman filter estimates the inertial navigation system errors over time using equations (22) and (23) [11].
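Equations (22) and (23) drive the usual discrete Kalman predict/update cycle. A minimal sketch follows, assuming the 15-state error model with the matrices Phi, Q, H and R supplied by the error model and sensor setup; all names are illustrative.

```python
import numpy as np

def kf_predict(x, P, Phi, Q):
    # Equation (22): propagate the INS error state and its covariance
    # through the state-transition matrix Phi; Q is the process noise
    # covariance of w_k.
    x = Phi @ x
    P = Phi @ P @ Phi.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    # Equation (23): fuse a measurement error vector z (the INS-minus-
    # camera position, velocity and attitude differences) with the
    # prediction; R is the measurement noise covariance of v_k.
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

The estimated error states are then subtracted from the INS outputs, which is what produces the corrected position curves shown in the test results.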
VI. QUADROTOR POSITION AND ATTITUDE CONTROL

In this part, the basics of quadrotor modeling are introduced; after that, the model is used to determine an approach to control the robot orientation and position using the information gathered by the navigation system.

[Fig. 7: dynamic model of the quadrotor]

The dynamic model of the quadrotor is described in two parts [12]: the first part covers the linear accelerations and the second part the angular accelerations:

$$m\begin{bmatrix}\ddot{x}\\ \ddot{y}\\ \ddot{z}\end{bmatrix} = \begin{bmatrix}0\\ 0\\ -mg\end{bmatrix} + R\begin{bmatrix}0\\ 0\\ \sum F_i\end{bmatrix}, \qquad I\begin{bmatrix}\dot{p}\\ \dot{q}\\ \dot{r}\end{bmatrix} = \begin{bmatrix}L(F_2 - F_4)\\ L(F_3 - F_1)\\ M_1 - M_2 + M_3 - M_4\end{bmatrix} - \begin{bmatrix}p\\ q\\ r\end{bmatrix} \times I\begin{bmatrix}p\\ q\\ r\end{bmatrix} \quad (24)$$

where $[\ddot{x}\ \ddot{y}\ \ddot{z}]^T$ is the linear acceleration vector, $[p\ q\ r]^T$ is the angular rate vector, $F_i$ is the force of propeller $i$, $M_i$ is the moment produced by rotor $i$, $L$ is the arm length, $R$ is the rotation matrix from the body-frame to the earth-frame, $m$ is the mass and $I$ is the inertia matrix.

A. Attitude Control

In attitude control of the quadrotor flying robot, the main requirements are the speed and accuracy of the measurements. A PD controller is used to control the orientations of the quadrotor in this project; it is able to control the quadrotor successfully given a fast control loop, including motors with fast dynamics and fast sensors that measure the orientations accurately.

The first step in controlling the quadrotor is writing a SISO model for it; it is then possible to control the four motors using four separate controllers in order to control the robot orientation. The equation below represents how to convert the MIMO model into four SISO models:

$$\begin{bmatrix}\omega_1^{des}\\ \omega_2^{des}\\ \omega_3^{des}\\ \omega_4^{des}\end{bmatrix} = \begin{bmatrix}1 & 0 & 1 & -1\\ 1 & -1 & 0 & 1\\ 1 & 0 & -1 & -1\\ 1 & 1 & 0 & 1\end{bmatrix} \begin{bmatrix}\Delta\omega_F\\ \Delta\omega_\phi\\ \Delta\omega_\theta\\ \Delta\omega_\psi\end{bmatrix} + \omega_h\begin{bmatrix}1\\ 1\\ 1\\ 1\end{bmatrix} \quad (25)$$

where $\omega_i^{des}$ is the desired speed of motor $i$, $\Delta\omega_F$, $\Delta\omega_\phi$, $\Delta\omega_\theta$ and $\Delta\omega_\psi$ are the outputs of the thrust, roll, pitch and yaw controllers respectively, and $\omega_h$ is the hovering speed. The three PID controllers used in this project are of the form [12]

$$\Delta\omega_\phi = k_p(\phi^{des} - \phi) + k_d(\dot{\phi}^{des} - \dot{\phi}) + k_i\int(\phi^{des} - \phi)\,dt \quad (26)$$

with corresponding controllers for $\Delta\omega_\theta$ and $\Delta\omega_\psi$.

B. Position Control

Using the dynamic model of the quadrotor introduced in equation (24), the roll and pitch angles are calculated in order to control the position:

$$\phi^{des} = \frac{1}{g}\left(\ddot{x}^{des}\sin\psi - \ddot{y}^{des}\cos\psi\right), \qquad \theta^{des} = \frac{1}{g}\left(\ddot{x}^{des}\cos\psi + \ddot{y}^{des}\sin\psi\right) \quad (27)$$

where $\phi^{des}$ and $\theta^{des}$ are the desired Euler angles needed to reach the desired accelerations, and $\ddot{x}^{des}$ and $\ddot{y}^{des}$ are the desired accelerations, calculated using the equation below:

$$\ddot{x}^{des} = k_p(x^{des} - x) + k_d(\dot{x}^{des} - \dot{x}) + k_i\int(x^{des} - x)\,dt \quad (28)$$

For $\ddot{y}^{des}$ the same equation can be used.
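A minimal sketch of the MIMO-to-SISO idea of equation (25), together with per-axis controllers of the form (26), is given below. The motor ordering and sign pattern follow one common plus-configuration convention and are assumptions, as are the gains; the derivative term damps the measured rate, assuming the desired rate is near zero.

```python
import numpy as np

def attitude_mixer(d_thrust, d_roll, d_pitch, d_yaw, omega_hover):
    # Equation (25)-style mixing for a plus-configuration quadrotor with
    # motors ordered front, right, rear, left.  Each desired motor speed
    # is the hovering speed plus combinations of the four SISO outputs.
    return omega_hover + np.array([
        d_thrust + d_pitch - d_yaw,   # front
        d_thrust - d_roll + d_yaw,    # right
        d_thrust - d_pitch - d_yaw,   # rear
        d_thrust + d_roll + d_yaw,    # left
    ])

class AxisPid:
    # One SISO controller per axis, as in equation (26); the gains are
    # tuning parameters, not values from the paper.
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0

    def step(self, desired, measured, rate, dt):
        error = desired - measured
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral - self.kd * rate
```

Running one AxisPid per roll, pitch and yaw axis and feeding the three outputs, plus a collective thrust correction, into attitude_mixer reproduces the four-separate-controllers structure described above.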
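For the position loop of Section VI-B, the following sketches the attitude set-point computation in the spirit of equation (27). The formula is the standard small-angle, near-hover inversion of the translational dynamics (24) and is an assumption rather than the paper's exact published expression.

```python
import numpy as np

def desired_attitude(ax_des, ay_des, yaw, g=9.81):
    # Near hover the thrust is approximately mg, so the desired
    # horizontal accelerations (from the position PID of equation (28))
    # determine how far to tilt at the current yaw angle.
    theta_des = (ax_des * np.cos(yaw) + ay_des * np.sin(yaw)) / g  # pitch
    phi_des = (ax_des * np.sin(yaw) - ay_des * np.cos(yaw)) / g    # roll
    return phi_des, theta_des
```

The returned phi_des and theta_des are then used as set-points for the attitude controllers above, closing the position loop around the attitude loop.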
VII. TEST RESULTS

The described methods were tested on a robot implemented at MRL. Fig. 8 shows the quadrotor, with the camera, the IMU and an RB100 [13] PC installed on it for testing the introduced methods. Fig. 9 shows the land-mark used in this project; its position was fixed, installed on a wall at a height of one meter from the ground.

[Fig. 8: quadrotor platform designed in MRL]
[Fig. 9: land-mark used in this project]

Fig. 10 shows the accelerations and Euler angles measured using the IMU. Fig. 11 to Fig. 13 show the position estimated using the proposed method: each compares, over time, the desired position, the position measured using the camera, the position estimated using the K-F, and the final filtered position along the X, Y and Z axes. As can be seen in these figures, the position measured using the mono camera alone is very slow and its error is high. The Kalman filter uses the vision observations to update the position calculated using the IMU, and the final filtered estimate is an acceptable measurement. Fig. 14 focuses on the position curves to show the steps of the method, and the estimated path followed by the quadrotor is shown in Fig. 15.

[Fig. 10: measurements gathered using IMU]
[Fig. 11, Fig. 12, Fig. 13: X, Y and Z position over time: desired position, position measured using camera, position estimated using K-F, and final filtered position]
[Fig. 14: detail of the position curves]
[Fig. 15: a. path measured using camera; b. path estimated using Kalman filter]
VIII. CONCLUSION

In this paper a method was proposed for estimating the position of a quadrotor flying robot using a vision-aided inertial navigation system (V-INS). The proposed method uses a mono camera and a low-cost inertial measurement unit (IMU) to estimate the position of the robot via a Kalman filter (K-F). Test results showed that this method obtains position information that is more precise and faster than using the camera alone.

ACKNOWLEDGMENT

The authors gratefully acknowledge Mr. Naim Ajil Foroushan, who researches quadrotors at MRL. His support and help are truly appreciated.

REFERENCES

[1] Chatfield, A.B., Fundamentals of High Accuracy Inertial Navigation. Reston, VA: American Institute of Aeronautics and Astronautics, 1997.
[2] Weiguang, G., et al., "Application of Adaptive Kalman Filtering Algorithm in IMU/GPS Integrated Navigation System," Geo-spatial Information Science, 10(1), 2007.
[3] Noureldin, A., et al., "Adaptive Fuzzy Prediction of Low-Cost Inertial-Based Positioning Errors," IEEE Transactions on Fuzzy Systems, 15(3): p. 519-529, 2007.
[4] Pangercic, D., Rusu, R.B., and Beetz, M., "3D-based monocular SLAM for mobile agents navigating in indoor environments," in Emerging Technologies and Factory Automation (ETFA 2008), IEEE International Conference on, Hamburg, 2008, p. 839-845.
[5] Wendel, J., et al., "An integrated GPS/MEMS-IMU navigation system for an autonomous helicopter," Aerospace Science and Technology (Elsevier), 10: p. 527-533, 2006.
[6] Titterton, D. and Weston, J., Strapdown Inertial Navigation Technology, 2nd ed., Radar, Sonar and Navigation Series 17, The Institution of Electrical Engineers (IEE), 2004.
[7] Strachan, N.J.C., Nesvadba, P., and Allen, A.R., "A method for working out the moments of a polygon using an integration technique," Pattern Recognition Letters, 11(5): p. 351-354, 1990.
[8] Caruso, M.J., "Applications of Magnetic Sensors for Low Cost Compass Systems," Honeywell, SSEC.
[9] Tuck, K., "Tilt Sensing Using Linear Accelerometers," Freescale Semiconductor, 2007.
[10] Mirzaei, F.M. and Roumeliotis, S.I., "A Kalman Filter-Based Algorithm for IMU-Camera Calibration: Observability Analysis and Performance Evaluation," IEEE Transactions on Robotics, 2008.
[11] Welch, G. and Bishop, G., "An Introduction to the Kalman Filter," TR 95-041, Department of Computer Science, University of North Carolina at Chapel Hill, 2001.
[12] Michael, N., et al., "The GRASP Multiple Micro-UAV Testbed," IEEE Robotics & Automation Magazine, 17(3): p. 56-65, 2010.
[13] RoBoard RB100, http://www.roboard.com