5G Technology World


Teaching Self-Driving Cars to Predict Pedestrian Movement

By University of Michigan | February 14, 2019

By zeroing in on humans’ gait, body symmetry and foot placement, University of Michigan researchers are teaching self-driving cars to recognize and predict pedestrian movements with greater precision than current technologies.

Data collected by vehicles through cameras, LiDAR and GPS allow the researchers to capture video snippets of humans in motion and then recreate them in 3D computer simulation. With that, they’ve created a “biomechanically inspired recurrent neural network” that catalogs human movements.
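The recurrent structure behind such a catalog can be sketched in miniature: a hidden state summarizes the motion seen so far, and each incoming pose updates it. The following is a toy Elman-style cell in plain Python with made-up weights, purely to illustrate the idea of a recurrent predictor; it is not the U-M network, whose architecture and parameters are not described here.

```python
import math

def rnn_step(pose, hidden, w_in=0.5, w_rec=0.9):
    # Elman-style update: the new hidden state blends the current pose
    # (a flat list of joint coordinates) with the previous state, so
    # gait history persists across frames. Weights are toy values.
    return [math.tanh(w_in * p + w_rec * h) for p, h in zip(pose, hidden)]

def predict_next_pose(pose, hidden):
    # Readout: here the hidden state itself stands in for the predicted
    # next pose; a real network would learn a separate output layer.
    hidden = rnn_step(pose, hidden)
    return list(hidden), hidden
```

Running the cell frame by frame over a video snippet yields one prediction per step, which is what lets the system forecast "the next step and the next and the next."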

With it, they can predict poses and future locations for one or several pedestrians up to about 50 yards from the vehicle, roughly the scale of a city intersection.

“Prior work in this area has typically only looked at still images. It wasn’t really concerned with how people move in three dimensions,” said Ram Vasudevan, U-M assistant professor of mechanical engineering. “But if these vehicles are going to operate and interact in the real world, we need to make sure our predictions of where a pedestrian is going don’t coincide with where the vehicle is going next.”

Equipping vehicles with the necessary predictive power requires the network to dive into the minutiae of human movement: the pace of a human’s gait (periodicity), the mirror symmetry of limbs, and the way in which foot placement affects stability during walking.

Much of the machine learning used to bring autonomous technology to its current level has dealt with two-dimensional images: still photos. A computer shown several million photos of a stop sign will eventually come to recognize stop signs in the real world and in real time.

But by utilizing video clips that run for several seconds, the U-M system can study the first half of the snippet to make its predictions, and then verify the accuracy with the second half.
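That split-clip scheme is straightforward to express in code. The sketch below assumes a trajectory is a list of (x, y) positions and that `predictor` is any callable mapping the observed first half to a forecast of equal length; it is an illustration of the evaluation idea, not the paper's pipeline.

```python
def evaluate_clip(trajectory, predictor):
    """Predict the second half of a clip from the first, return mean error."""
    half = len(trajectory) // 2
    observed, actual = trajectory[:half], trajectory[half:half * 2]
    predicted = predictor(observed)
    # Mean Euclidean error between predicted and actual (x, y) positions.
    errs = [((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
            for (px, py), (ax, ay) in zip(predicted, actual)]
    return sum(errs) / len(errs)
```

A constant-velocity baseline, for example, scores zero error on a pedestrian walking in a straight line, so any gain the learned model shows must come from harder, non-linear motion.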

“Now, we’re training the system to recognize motion and making predictions of not just one single thing—whether it’s a stop sign or not—but where that pedestrian’s body will be at the next step and the next and the next,” said Matthew Johnson-Roberson, associate professor in U-M’s Department of Naval Architecture and Marine Engineering.

To explain the kind of extrapolations the neural network can make, Vasudevan describes a common sight.

“If a pedestrian is playing with their phone, you know they’re distracted,” Vasudevan said. “Their pose and where they’re looking is telling you a lot about their level of attentiveness. It’s also telling you a lot about what they’re capable of doing next.”

The results have shown that this new system improves upon a driverless vehicle’s capacity to recognize what’s most likely to happen next.

“The median translation error of our prediction was approximately 10 cm after one second and less than 80 cm after six seconds. All other comparison methods were up to 7 meters off,” Johnson-Roberson said. “We’re better at figuring out where a person is going to be.”
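A metric like the one Johnson-Roberson quotes could be computed as follows: for each test clip, take the Euclidean distance between the predicted and true pedestrian position at a fixed horizon, then report the median across clips. This is a sketch of that calculation under those assumptions, not the paper's evaluation code.

```python
import statistics

def median_translation_error(predicted, actual):
    """predicted/actual: lists of (x, y) positions in meters, one per clip."""
    dists = [((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
             for (px, py), (ax, ay) in zip(predicted, actual)]
    return statistics.median(dists)
```

The median, unlike the mean, is insensitive to a few badly mispredicted clips, which makes it a common choice for trajectory benchmarks.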

To rein in the number of options for predicting the next movement, the researchers applied the physical constraints of the human body, such as our inability to fly and our fastest possible speed on foot.
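One such constraint can be sketched as a post-processing step: clamp any predicted step so the implied speed never exceeds a plausible human sprint. The 10 m/s cap below is an assumed figure for illustration, not a value from the paper.

```python
def clamp_step(prev, pred, dt, max_speed=10.0):
    """Pull a predicted (x, y) position back toward `prev` if it implies
    a speed above max_speed (m/s) over the timestep dt (seconds)."""
    dx, dy = pred[0] - prev[0], pred[1] - prev[1]
    dist = (dx * dx + dy * dy) ** 0.5
    limit = max_speed * dt
    if dist <= limit:
        return pred
    # Rescale the step so its length equals the maximum feasible distance.
    scale = limit / dist
    return (prev[0] + dx * scale, prev[1] + dy * scale)
```

Pruning physically impossible candidates this way shrinks the search space the network has to rank, which is one reason constrained predictors can be both faster and more accurate.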

To create the dataset used to train U-M’s neural network, researchers parked a vehicle with Level 4 autonomous features at several Ann Arbor intersections. With the car’s cameras and LiDAR facing the intersection, the vehicle could record multiple days of data at a time.

Researchers bolstered that real-world, “in the wild” data with traditional pose datasets captured in a lab. The result is a system that will raise the bar for what driverless vehicles are capable of.

“We are open to diverse applications and exciting interdisciplinary collaboration opportunities, and we hope to create and contribute to a safer, healthier, and more efficient living environment,” said U-M research engineer Xiaoxiao Du.

A paper on the work is published under early access online in IEEE Robotics and Automation Letters. It will appear in a forthcoming print edition. The work was supported by a grant from Ford Motor Company.


Filed Under: Wireless Design and Development

 
