While the U-M team’s robot used a laser sensor to scan 40,000 points per second as it navigated an environment, SkySpecs’ production model will rely on sophisticated cameras instead.
“We’re really focusing on the visual side, basically using the cameras the way a human does,” Ellis said.
The robot’s modular design features a circular center platform from which four rotor-bearing arms extend. A number of specialized cameras or sensors can be installed interchangeably on top of the platform.
“Our biggest goal is anyone can take it out of the box and fly it, just completely untrained,” he said.
The robot will not require any programming by users. Instead, they will interact with it via a touchscreen tablet.
“This will have high-level commands and the ability to place waypoints in the environment that the vehicle will follow,” he said.
The high-level commands will vary with market and application, but may include take off, land, return to home, hold position, explore environment, and more.
“The learning curve to operate our vehicle will be minimal,” he said.
Ellis said “no tools are necessary.” In fact, he said, an engineer or soldier could remove the arms or swap out the sensors without even taking off their gloves.
“It can go in a harsh environment, be waterproof, run into things and bounce off and be fine,” he said.