Automated undersea mapping, swarms of computer-piloted drones, self-driving cars: 20 years ago, devices like these were found only in the pages of an Isaac Asimov novel.
Today, they are the focus of several research labs in the Department of Electrical and Computer Engineering at BYU. A dedicated faculty member heads each lab, steering the research and offering mentorship to undergraduate, graduate and doctoral students. The campus-based projects use cutting-edge hardware and software to solve challenging robotics problems.
The Daily Universe spent some time with two of those groups: the Multiple Agent Intelligent Coordination and Control Lab and the Robotic Vision Lab.
Both groups call the Information Systems and Science wing of the department home. Although the two labs address challenges related to machine automation and robotic control, they take different approaches and find different applications for their work.
MAGICC
Cammy Peterson, who holds a doctorate in aerospace, is a faculty member conducting research in the MAGICC Lab. She explained 'the big picture' of much of her work and the work of the lab.
'Most of our algorithms are actually just autonomous-systems based,' Peterson said. 'They could be underwater, above-water or ground robots.'
An algorithm is a set of step-by-step mathematical instructions that determines how computer systems, like the ones in MAGICC's robots, make independent decisions. MAGICC develops algorithms that control physical robots' movement through space.
'A lot of what I do and teach is on the control side,' Peterson said. 'How do you control the vehicles? How do you do the path-finding ... how do you ensure that they're actually following that path?'
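The path-following Peterson describes can be illustrated with a minimal sketch in Python. The controller below is a hypothetical example, not the lab's code: it turns a vehicle toward its next waypoint at a rate proportional to its heading error, one of the simplest forms of feedback control.

    import math

    def steer_toward(waypoint, position, heading, gain=1.5):
        """Turn a vehicle toward a waypoint (hypothetical sketch)."""
        # Angle from the vehicle's position to the waypoint
        desired = math.atan2(waypoint[1] - position[1],
                             waypoint[0] - position[0])
        # Heading error, wrapped into the range [-pi, pi]
        error = (desired - heading + math.pi) % (2 * math.pi) - math.pi
        # Turn-rate command proportional to the error
        return gain * error

Run every fraction of a second with fresh position and heading measurements, a loop like this keeps the vehicle pointed along its path. Making that loop robust to wind, sensor noise and other disturbances is where the research lies.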
Peterson alluded to dramatic advances in autonomous vehicle technology — drones, in particular — that have developed in the last few decades.
'Some of those early drones, even just a decade ago, you'd try to fly them and they were almost uncontrollable,' Peterson said. 'Any slight wind or movement would just blow them off. And now you can basically take out a drone and they're stable enough that somebody who's never flown (a drone) before can go out and fly them.'
The advancement in drone technology alone 'opens up the possibilities for how we can use them and make the world better,' Peterson noted.
Jaron Ellingson, a Ph.D. student in the Mechanical Engineering Department who works in the MAGICC Lab, hopes to leverage these advancements to build a system of autonomous drone swarms that relies on a decentralized approach.
Ellingson explained that the drones use algorithms to estimate one another's locations and adjust their flight paths accordingly.
He envisions companies such as Amazon or UPS using this system to organize large fleets of independent drones. 'They can broadcast their position ... and other drones can take that position ... and avoid each other.'
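The core idea, each drone broadcasting its position while the others adjust course to keep their distance, can be sketched in a few lines of Python. The function below is a hypothetical illustration, not Ellingson's system: it computes a small course adjustment that pushes a drone away from any broadcast neighbor that has drifted too close.

    def avoidance_nudge(my_pos, neighbor_positions, safe_dist=10.0):
        """Return an (x, y) course adjustment away from close neighbors."""
        dx, dy = 0.0, 0.0
        for nx, ny in neighbor_positions:
            # Vector pointing from the neighbor toward this drone
            ox, oy = my_pos[0] - nx, my_pos[1] - ny
            dist = (ox * ox + oy * oy) ** 0.5
            if 0 < dist < safe_dist:
                # Push harder the closer the neighbor gets
                strength = (safe_dist - dist) / dist
                dx += ox * strength
                dy += oy * strength
        return dx, dy

Because every drone runs the same calculation on whatever broadcasts it hears, no central computer has to choreograph the fleet.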
The MAGICC Lab relies on custom-made computer code, designed by human programmers and engineers, to govern the movement of autonomous vehicles. The algorithms are fine-tuned to needs and challenges that researchers understand very well. Ellingson and his drone swarm are like a conductor and a symphony orchestra — Ellingson knows what he wants and gives his performers instructions to develop just the right sound.
Robotic Vision Lab
In another part of the Electrical and Computer Engineering Department, however, students and faculty practice the computing equivalent of free-form jazz.
The Robotic Vision Lab focuses on using artificial intelligence and machine learning to achieve vision in robots. Its research spans self-driving cars, facial recognition and food inspection.
Casey Sun, a Ph.D. student in the lab, explained how Robotic Vision uses machine learning techniques in its projects. 'You can collect some clean data in a laboratory setting ... and you try to fit the model to this clean data. The model will be able to learn some patterns in the clean data.'
To achieve robotic vision, the lab uses special computer programs, called neural networks, that learn to recognize patterns by training on 'clean,' lab-produced data and then adjusting when real-world examples, so-called 'noisy data,' don't match what they have seen.
A neural network, Sun explained, in essence says: 'I didn't see this pattern before, so I'm going to change (my model).'
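That loop can be shown in miniature. The Python below is a hypothetical toy, not the lab's software: a one-weight model that changes itself, by a technique called gradient descent, whenever its predictions miss the pattern in its training data.

    def train(examples, weight=0.0, learning_rate=0.01, epochs=100):
        """Fit y = weight * x to (x, y) pairs by gradient descent."""
        for _ in range(epochs):
            for x, y in examples:
                prediction = weight * x
                error = prediction - y               # "I didn't see this pattern..."
                weight -= learning_rate * error * x  # "...so I change my model."
        return weight

    # Clean, lab-produced data following the pattern y = 2x
    clean_data = [(1, 2), (2, 4), (3, 6)]
    print(train(clean_data))  # converges near 2.0

A real neural network adjusts millions of weights instead of one, but the principle Sun describes is the same.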
Faculty member D.J. Lee, who holds a doctorate in electrical engineering, leads the lab's various projects. He described how he hopes some of the lab's projects will benefit people on campus and around the world.
'Our facial motion authentication project can improve security and convenience for users. Our visual inspection automation projects improve food safety and food production efficiency. Both will (have a) huge impact on our daily lives,' Lee said.
Looking toward the future
Researchers from both labs shared their visions of how work in robotics will improve people's lives.
'I’m hoping that our work in autonomy leads to a world where people don’t need to perform repetitive, dangerous or monotonous work, and can focus on more important and rewarding ventures,' Adam Welker, an undergraduate student working in the MAGICC Lab, said.
'I also think that UAVs can connect us to places. Whether that’s delivering support to isolated rural areas or facilitating transportation in dense cities, drones have a great potential to improve our infrastructure,' Welker said.
Peterson shared her misgivings, as well as her hopes, about robots. 'It depends on the day. I'm always either optimistic or skeptical,' she said.
She cited self-driving cars as an example of a technology that has achieved so much but still has a long way to go. 'It's amazing what they've done, but it's so challenging still. There really is just no substitute for how smart humans are. That's definitely something that AI has helped me to appreciate: stuff we don't even think about is really complicated.'