Firefighting Drone

This project was done in collaboration with North Carolina State University, Northrop Grumman, and the Army Research Laboratory, with advisory support from municipal firefighters in the Research Triangle Park region.

I’m impressed by how much Alwyn’s team accomplished in such a short amount of time. These guys can compete with any world-class research and development team.

– Dr. Larry M. Silverberg, Professor, Mechanical and Aerospace Engineering

Project Description 

Figure 1: Just one iteration of the research prototype

In partnership with local municipal firefighters and academic researchers, we developed and tested a semi-autonomous companion drone that uses machine vision to support indoor search-and-rescue operations. The project spanned a novel thrust configuration, custom UI/UX design, hardware, firmware, and software development, embedded systems design, PCB design, feedback control systems, and sensor data acquisition.

Firefighters have approximately 10 minutes to search a burning building and must put themselves at considerable risk to rescue any survivors. Visibility is often severely impaired during these situations, and firefighters often crawl on their hands and knees to see under the smoke and look for survivors. In addition, as the fire progresses, floors and ceilings can weaken and become deadly but unseen hazards. Our goal was to build a remotely piloted aircraft capable of assisting firefighters in smoke-filled buildings.

An effective solution must be low-cost, easy to pilot indoors without much training, and able to help locate survivors and hazards in smoke-filled rooms.

What We Did

Figure 2: One of many custom PCBs that were designed and produced during the course of the project

Over the course of 18 months, our multidisciplinary team developed a drone system suitable for rapid iteration and testing of various sensors, control schemes, and payloads, so that research on this topic can advance on a reliable platform.

Our team comprised expert engineers with specialties in each of the following disciplines:

  • Mechanical Engineering
  • Aerospace Engineering
  • Electrical Engineering (Software)
  • Electrical Engineering (Hardware)
  • Robotics and Control Systems
  • Drone Piloting

This project drew on most of our core competencies, including:

  • CAD design and simulation
  • Simulation in virtual environments
  • Parts design and fabrication, including 3D printing
  • Custom PCB design, fabrication and testing
  • Software development
  • UI/UX development
  • Digital signal processing
  • Feedback control systems
  • Troubleshooting
  • Failure analysis
  • Project management

For this project, we designed and built most of the major systems and subsystems, including:

Airframe – the motors, propellers, enclosures, and appurtenances required for a flyable drone that can lift the payload of sensors, receivers, transmitters, and other electronics needed to meet the project objectives.

Piloting – the radio transmitter and receiver, plus the flight controller, which together allow the pilot to direct the drone’s movement. The piloting system also incorporates feedback from the Pilot Assist Control system and its display.

Figure 3: “Pilot Override” Control Scheme, with hypothetical obstacle in northeast position

Collision Avoidance Sensing – an array of different sensors whose measurements were processed to detect obstacles and passed to the Pilot Assist Control program.
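
To make the sensing pipeline concrete, here is a minimal sketch of how a 360-point lidar scan could be reduced to per-sector obstacle distances before being handed to the PAC. The sector count, alert threshold, and function names here are hypothetical illustrations, not the project’s actual parameters:

```python
SECTORS = 8               # hypothetical: one sector per 45 degrees (N, NE, E, ...)
OBSTACLE_RANGE_M = 1.5    # hypothetical alert threshold, in meters

def nearest_per_sector(scan):
    """Reduce a 360-point lidar scan (index = bearing in degrees,
    value = range in meters) to the nearest return in each sector."""
    nearest = [float("inf")] * SECTORS
    for deg, dist in enumerate(scan):
        if dist <= 0:                                # zero/negative = no return
            continue
        sector = int(((deg + 22.5) % 360) // 45)     # center sector 0 on north
        nearest[sector] = min(nearest[sector], dist)
    return nearest

def obstacle_flags(scan):
    """Flag each sector whose nearest return is inside the alert range."""
    return [d < OBSTACLE_RANGE_M for d in nearest_per_sector(scan)]
```

A fixed-size summary like this is one way to keep the control logic independent of which sensor produced the raw ranges, so front ends such as lidar or ultrasonic arrays can be swapped without touching the PAC.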

Pilot Assist Control (PAC) – this system received raw pilot input from the piloting system and data from the collision avoidance sensors, then modified the control output to assist the pilot by avoiding collisions and maintaining stability.

Figure 4: Control Flow Diagram for the Pilot Assist Control System
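
As a rough sketch of the control flow in Figure 4, continuing the hypothetical per-sector distances above: stick values are assumed to be normalized to [-1, 1], and the gain is illustrative rather than a tuned project value.

```python
import math

OBSTACLE_RANGE_M = 1.5   # hypothetical alert threshold (matches the sensing sketch)
PUSHBACK_GAIN = 0.5      # hypothetical: strength of the corrective push

def clamp(v):
    """Keep a stick value inside the normalized [-1, 1] range."""
    return max(-1.0, min(1.0, v))

def assist(pitch, roll, nearest):
    """Blend raw pilot stick input with a correction that pushes the drone
    away from every sector whose obstacle is inside the alert range."""
    for sector, dist in enumerate(nearest):
        if dist >= OBSTACLE_RANGE_M:
            continue
        # The correction grows linearly as the obstacle gets closer.
        weight = PUSHBACK_GAIN * (1.0 - dist / OBSTACLE_RANGE_M)
        bearing = math.radians(sector * 45)   # sector 0 = north (straight ahead)
        pitch -= weight * math.cos(bearing)   # push opposite the obstacle bearing
        roll -= weight * math.sin(bearing)
    return clamp(pitch), clamp(roll)
```

With an obstacle in the northeast, as in Figure 3, a scheme like this nudges the corrected stick position toward the southwest while still honoring most of the pilot’s input.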

Telemetry – This system captured data transmitted over Wi-Fi to create a Heads-Up Display (HUD) that informed the pilot of the corrections being applied to the aircraft and presented live sensor data. The HUD displayed real-time IMU accelerometer data, drone velocity, and drone height, all as time-based scatter plots with dynamically ranging axes. By the end of the project, the telemetry GUI was showing the following:

  • Overhead map generated by the 360 points of the lidar
    • Option to show the ultrasonic cones instead of the map
  • “Stick-position” for both sticks of the remote
    • showing actual stick position from the pilot
    • showing “corrected” position from the pilot assist control system
    • Actual numeric values for throttle, yaw, roll and pitch were also printed
  • All 10 ultrasonic distances (later dropped after abandoning the ultrasonic option)
  • Top/Down distance and respective correction amounts
  • Graphs of Roll/Pitch/Yaw

Figure 5: Screen capture of the telemetry HUD in action
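
To illustrate the “dynamically ranging axes” behavior, here is a minimal sketch of a single HUD pane. The JSON packet layout, port number, and field names are assumptions made for the example, not the project’s actual telemetry format:

```python
import json
import socket
from collections import deque

import matplotlib.pyplot as plt

# Hypothetical: telemetry arrives as JSON datagrams over Wi-Fi, e.g.
# {"t": 12.3, "roll": 1.5, "pitch": -0.7, "yaw": 0.1}
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 14550))         # hypothetical telemetry port
sock.settimeout(0.05)

WINDOW = 200                          # samples kept on screen
times, rolls = deque(maxlen=WINDOW), deque(maxlen=WINDOW)

plt.ion()
fig, ax = plt.subplots()
dots, = ax.plot([], [], ".", markersize=3)   # scatter-style time series
ax.set_xlabel("time (s)")
ax.set_ylabel("roll (deg)")

while plt.fignum_exists(fig.number):
    try:
        pkt = json.loads(sock.recv(1024))
        times.append(pkt["t"])
        rolls.append(pkt["roll"])
    except socket.timeout:
        pass
    if times:
        dots.set_data(times, rolls)
        ax.relim()                    # rescale axes to the visible window:
        ax.autoscale_view()           # the "dynamically ranging" behavior
    plt.pause(0.01)
```

A full HUD would multiplex panes like this for the overhead map, stick positions, and altitude, but the dynamic rescaling idea is the same for each plot.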

This was a great project and we’re very proud of it. We worked hard, had a lot of fun, and exercised our entire engineering skillset. Of course, every research and development project has its challenges, and this one was no exception. We crashed and rebuilt our test drones dozens of times, and we developed and tested quite a few variations of most of the major subsystems. Testing is one of the most hazardous parts of the project, not only because of the risk of harm to people but also because a crash can mean the complete loss of a vehicle and all the work that went into it. In addition, many things happen simultaneously during a flight test, and it is hard to keep track of what actually happened in real time. It is therefore important to take steps to mitigate the programmatic risks involved in flight testing.