
UC San Diego Students Demonstrate Smart Camera Trap at New Engineering Competition

By:

  • Doug Ramsey



Sentinel team captain Riley Yeakle (left) and Chris Ward are electrical engineering majors who competed with fellow team members Perry Naughton and Kyle Johnson (not pictured) for the Cornell Cup USA, presented by Intel. [Photos by Alex Matthews/Calit2 UC San Diego]

Forget about building a better mouse trap. University of California, San Diego sophomore Riley Yeakle and his teammates have come up with a better camera trap, and they faced off against finalists from around the country when they unveiled working prototypes of their embedded-system designs at a new, national engineering student competition. While falling short of the top three, both the UC San Diego and UC Berkeley teams were awarded Honorable Mentions for their innovations.

Embedded systems are computer systems designed and built for specific tasks. The UC San Diego undergraduates demonstrated their Sentinel intelligent camera trap system at the first annual Cornell Cup USA competition, presented by Intel, May 4-5 at Walt Disney World in Florida. The team was one of 22 collegiate teams selected as finalists to demonstrate their engineering projects in embedded-system design and development.

The judges were looking at "how well we tackled the project, the robustness of the process, how we met specific technical challenges and limitations of existing camera traps, and how we measure our performance," says Yeakle, the team captain from the Electrical and Computer Engineering (ECE) department of UCSD’s Jacobs School of Engineering. "The rules make it clear that they [were] looking for the team that makes the best use of its time and resources to meet a specific need."

Yeakle and fellow ECE undergraduates Perry Naughton (fourth year), Kyle Johnson and Chris Ward (both third-year undergrads) benefited from direct access to National Geographic Society explorers and engineers through the UCSD-National Geographic Engineers for Exploration program. Led by Albert Lin, a research scientist in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2), the program puts teams of students to work on technologies that could eventually be deployed in the field.

“Our top students from a variety of disciplines develop world-class engineering solutions for challenges in exploration,” says Lin, a three-time UC San Diego alumnus (Ph.D. in materials science ’08, M.S. in electrical engineering ’06, ’04). “The concept for the computer vision-based, tracking camera trap was proposed and developed by the students, and their excitement has been contagious.”

The UC San Diego team has worked long hours since last November, after deciding to compete in the Cornell Cup USA competition.

“We talked to our colleagues at National Geographic about the shortcomings of current camera-trap systems and what they’d like to be able to do with traps that cannot be done with what’s available commercially today,” explains Yeakle. “National Geographic helped us focus in on the computer vision and object tracking, since other systems do not yet offer these features.”

The Engineers for Exploration student video team produced a video showcasing the Sentinel camera trap's development since November 2011.

“They expressed interest in being able to track animals, and that’s what convinced us to take advantage of camera vision to pinpoint and track animals automatically and remotely – capturing their movements in the wild with no explorer, photographer or scientist present.”

The new, improved camera trap introduces a cascade of low-power piezoelectric vibration sensors placed around the camera turret. The piezoelectric sensors convert ground movement into voltage, so if an animal comes close enough to trip a sensor (usually within a radius of three feet from the sensor), the turret automatically swings around to face in the direction of the tripped sensor.
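As a rough illustration of that trigger-and-point behavior, here is a minimal Python sketch. The read_sensor_voltage and rotate_turret_to helpers, the eight-sensor layout, and the 0.5-volt threshold are hypothetical stand-ins for the team's actual sensor hardware and tuning, not their implementation.

    import random
    import time

    NUM_SENSORS = 8          # assumed: sensors spaced evenly around the turret
    TRIGGER_VOLTS = 0.5      # assumed trip threshold, not the team's value

    def read_sensor_voltage(channel):
        # Stand-in for reading one piezo channel; simulated with random values here.
        return random.uniform(0.0, 0.6)

    def rotate_turret_to(angle_degrees):
        # Stub for the stepper-motor driver that points the camera turret.
        print(f"Turret -> {angle_degrees:.0f} degrees")

    def wait_for_trigger():
        # Poll every channel; the first one to cross the threshold sets the bearing.
        while True:
            for channel in range(NUM_SENSORS):
                if read_sensor_voltage(channel) > TRIGGER_VOLTS:
                    rotate_turret_to(channel * 360.0 / NUM_SENSORS)
                    return channel   # hand off to the vision stage
            time.sleep(0.01)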

Simultaneously, an Intel Atom processor begins to run a computer-vision algorithm looking for ‘blobs’ of color (groups of similarly colored pixels). A large, tan-colored blob, for example, may indicate a lion. If so, the camera can lock on to the moving blob and begin tracking it.
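A minimal sketch of that style of color-blob detection, written here with OpenCV; the tan HSV range and the minimum-area cutoff are illustrative guesses rather than the team's actual algorithm or thresholds.

    import cv2
    import numpy as np

    TAN_LOW = np.array([10, 40, 80])      # assumed lower HSV bound for a tan color
    TAN_HIGH = np.array([30, 180, 255])   # assumed upper HSV bound
    MIN_AREA = 5000                       # ignore blobs smaller than this (pixels)

    def find_largest_blob(frame_bgr):
        # Threshold the frame in HSV space and keep the largest matching region.
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, TAN_LOW, TAN_HIGH)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        biggest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(biggest) < MIN_AREA:
            return None
        # Report the blob's center so the tracking stage has a target to follow.
        x, y, w, h = cv2.boundingRect(biggest)
        return (x + w // 2, y + h // 2)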

“Our object-tracking algorithm sends instructions to the stepper motors of the turret to keep the object in the center of the camera frame,” explains Yeakle. “This allows the camera to capture high-quality video of a moving animal, as well as an extended opportunity to take photographs after a regular camera trap may no longer ‘see’ the animal because it has disappeared out of frame.”
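As a toy version of that centering loop, the sketch below measures how far the blob's center sits from the middle of the frame and nudges a pan stepper in that direction; the frame width, dead band, gearing constant, and step_pan_motor stub are assumptions rather than the team's motor interface.

    FRAME_WIDTH = 1920        # assumed frame width in pixels
    DEADBAND_PX = 40          # ignore small offsets to avoid jitter
    PIXELS_PER_STEP = 25      # rough guess at how many pixels one motor step covers

    def step_pan_motor(steps):
        # Stub for the stepper driver; positive steps pan right, negative pan left.
        print(f"pan {steps:+d} steps")

    def center_on_blob(blob_center_x):
        error_px = blob_center_x - FRAME_WIDTH // 2
        if abs(error_px) < DEADBAND_PX:
            return                # close enough to centered; do nothing
        step_pan_motor(error_px // PIXELS_PER_STEP)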

“Getting access to the Intel Atom processor was a major factor in our decision to go the distance and integrate computer-vision techniques,” adds teammate Perry Naughton. “The Atom has good processing power, and computer vision is processing intensive.”

In addition to the Atom processor and travel expense money from Intel, the team received a donation of power supplies and lab equipment from Tektronix, and software from MathWorks. National Geographic supplied roughly $600 worth of parts to build the camera trap through the Engineers for Exploration program.

The resulting system has 360-degree coverage, so the digital single-lens-reflex (D-SLR) camera in the turret could track and record an animal walking in circles around the camera.

“Current camera traps have a limited field of view and they often are triggered mistakenly because the camera records the scene even if there is no animal there,” says Yeakle. “Our design overcomes those problems, and our ultimate goal is to produce a robust, intelligent, autonomous and low-power system that can be deployed in different environments.”

National Geographic has already expressed interest in the technology for future deployment to capture video and photos of wildlife in the field.

There are two key challenges to overcome before the Sentinel can provide remotely recorded, broadcast-quality video of animals for National Geographic Channel documentaries. In its current implementation, the object tracking is slow and jerky. The team is exploring the use of a proportional-integral-derivative (PID) controller, similar to the generic control-loop feedback mechanisms used in industrial control systems. The PID controller should smooth the motion of the stepper motors, which currently race to the latest coordinates they are given. The researchers are also testing a Kalman filter to improve the estimate of the most probable next location of the detected object's center.
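For a sense of what that smoothing could look like, here is a generic single-axis PID loop in Python; the gains, frame rate, and example offsets are arbitrary placeholders rather than the team's tuned values.

    class PanPID:
        # Generic PID controller for the pan axis; gains are illustrative, not tuned.
        def __init__(self, kp=0.6, ki=0.05, kd=0.2):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error_px, dt):
            # error_px is the blob's offset from frame center; the return value is a
            # smoothed velocity command for the stepper instead of a raw jump.
            self.integral += error_px * dt
            derivative = (error_px - self.prev_error) / dt
            self.prev_error = error_px
            return self.kp * error_px + self.ki * self.integral + self.kd * derivative

    pid = PanPID()
    for error in (300, 240, 150, 60, 10):     # example offsets from successive frames
        print(pid.update(error, dt=1 / 30))   # assuming roughly 30 frames per second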


Components of the camera trap system include 1) wireless, piezoelectric vibration sensors, 2) camera and turret, and 3) inner workings, including Intel Atom processor and stepper motors.

“I am incredibly proud of the synergistic energy of this dynamic team, made up of students who define the best in interdisciplinary collaboration, initiative, and leadership,” beams Calit2’s Lin, who is also a National Geographic Emerging Explorer. “The methods that they have created will have real impact in exploration, and they highlight the power of engineering for real-world applications.”

Based upon the highly successful Intel Cup China, which attracts over 26,000 students, the Cornell Cup USA was designed as an exposition that invites students to dream up, design and build the next great embedded-technology invention. The top three teams received prizes of $10,000, $5,000 and $2,500, respectively. The competition is organized by David Schneider of Cornell's Systems Engineering program, and by Byron Gillespie and Kimberly Sills of Intel.

Three of the 22 finalists for the inaugural Cornell Cup USA are based in California. UC San Diego's Sentinel and UC Berkeley's Solar Drone UAV were the only University of California projects to make the cut; they were joined by the University of Southern California's VISIONary, an indoor navigation system for people who are visually impaired.

Intel travel support allowed all four members of the UC San Diego team to attend the competition. Also on hand: team advisor Ryan Kastner, a professor of Computer Science and Engineering (CSE) in the Jacobs School of Engineering. Kastner teaches CSE 145, a course on embedded systems, and co-directs the Engineers for Exploration program. He is currently on sabbatical in Washington, D.C. at National Geographic.


Members of the Sentinel camera trap team are ECE undergraduates (l-r) Kyle Johnson, Chris Ward, Riley Yeakle and Perry Naughton.

“The UC San Diego team is dealing with a set of demands that goes beyond those required for the contest, because they are building the camera trap for National Geographic too,” says Kastner.

In particular, he adds, they must make it rugged enough to be set up in the wild for long periods of time. It must be usable by scientists and explorers without an engineering background, and, as much as possible, it must be easy to reconfigure to target different animals across a wide variety of environments. The team addressed these challenges by turning them into design requirements, as the contest encourages.

“The contest itself is not only about making the best project, but the journey to make the project -- creating the correct specifications and design requirements based upon the application, and meeting or exceeding those requirements,” observes Kastner. “The students have done a great job in building Sentinel as an intelligently-triggered video trap that senses and sees its environment in order to capture high-definition video of all animals in its surroundings. Observing and documenting the behavior of elusive and endangered species will one day be easier, cheaper and more robust, thanks to the students’ ingenuity, teamwork, and our one-of-a-kind partnership with National Geographic.”

In the run-up to the competition in Florida, the UC San Diego team first tested the camera trap on a red box moving around the camera, carried by hand. Then it was time to test their system ‘in the wild’, i.e., in a backyard with several dogs. So far, so good.
