شهد احمد صبيح

Plant diseases in greenhouses currently reduce yields drastically. Moreover, these diseases are typically treated uniformly across an entire greenhouse or an entire plant, which increases chemical use and exposure for both farmers and consumers. Therefore, this thesis proposes an autonomous wheeled mobile robot for plant disease detection in conventional greenhouses, which generally lack the infrastructure required by agricultural robots. The robot aims to reduce human intervention in greenhouses by performing labor-intensive and hazardous tasks on behalf of farmers, and to reduce exposure to the chemicals used in treating diseased plants. These objectives are achieved by designing a robot capable of detecting diseases while navigating greenhouses safely and efficiently. The mobile robot system is divided into two subsystems operating on the designed wheeled mobile robot: a disease detection system and a navigation system. Four state-of-the-art convolutional neural network (CNN) architectures, namely Inception-v3, ResNet50, SqueezeNet 1.1, and VGG16, were developed to detect plants and classify diseases of plants and their leaves. These networks were retrained on the PlantVillage and cotton datasets. The first dataset includes four crops (cherry, grape, strawberry, and peach) with ten distinct classes of diseased and healthy leaves, while the cotton dataset contains four classes of diseased and healthy plants and leaves. The CNN architectures were retrained using three approaches: shallow transfer learning, deep transfer learning, and training from scratch. Results show that the best performance, an accuracy of 99.908%, is achieved by the VGG16 network, which can also detect diseases on multiple leaves in a single image. VGG16 is therefore integrated into the system to classify images in real time from a camera mounted on a 4-degree-of-freedom manipulator positioned using geometric inverse kinematics.
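To illustrate the geometric (closed-form) inverse kinematics used to position the camera manipulator, the following is a minimal sketch for a planar 2-link arm. The thesis manipulator has 4 degrees of freedom; the two-link case shown here, along with the link lengths in the usage example, is a simplified illustration of the same geometric approach, not the thesis's actual arm model.

```python
import math

def ik_2link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar 2-link arm.

    Returns the elbow-down joint angles (theta1, theta2) in radians that
    place the end effector at (x, y), or None if the target is unreachable.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        return None  # target lies outside the reachable workspace
    theta2 = math.acos(c2)  # elbow-down branch
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

def fk_2link(theta1, theta2, l1, l2):
    """Forward kinematics, used to verify an IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Example: unit links, target at (1, 1).
sol = ik_2link(1.0, 1.0, 1.0, 1.0)
```

Because the solution is algebraic rather than iterative, it is fast enough to run on the robot's onboard controller at camera-positioning rates.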
The navigation system is divided into map-based navigation and sensor-based navigation; the former is further divided into offline and online navigation. A modified Generalized Voronoi Diagram roadmap and a depth-first search traversal method are combined to generate a global path, which is then tracked using a pure pursuit algorithm and a PID controller. Additionally, Kalman filter sensor-fusion localization is used to maneuver the robot by controlling the actuators, all while a sensory map is being built. Sensor-based navigation, in contrast, relies on sensor readings to navigate safely and achieve map coverage; alongside it, a sensory mapping algorithm builds a map during operation and generates a new map for use in subsequent runs.
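The Kalman filter localization step can be sketched in one dimension as follows: a predict step dead-reckons from odometry-style control inputs, and an update step corrects the estimate with a noisy position measurement. This is a minimal illustration of the fusion principle only; the noise variances, sensor values, and scalar state are hypothetical, and the thesis's filter operates on the robot's full pose rather than a single coordinate.

```python
def kalman_1d(u_controls, z_measurements, q, r, x0=0.0, p0=1.0):
    """One-dimensional Kalman filter fusing two information sources.

    u_controls:    commanded displacements per step (odometry-style input)
    z_measurements: noisy absolute position readings per step
    q, r:          process and measurement noise variances
    Returns the list of fused position estimates, one per step.
    """
    x, p = x0, p0
    estimates = []
    for u, z in zip(u_controls, z_measurements):
        # Predict: propagate the state with the control input;
        # uncertainty grows by the process noise.
        x = x + u
        p = p + q
        # Update: blend in the measurement, weighted by the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Example: the robot is commanded 1.0 m per step; the sensor reads
# slightly noisy positions around the true values 1, 2, 3.
est = kalman_1d([1.0, 1.0, 1.0], [1.05, 2.10, 2.95], q=0.01, r=0.1)
```

The gain k automatically weights the correction: a precise sensor (small r) pulls the estimate toward the measurement, while noisy readings leave it closer to the dead-reckoned prediction.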
