September 19, 2018

Legal and safety issues loom around ethics, AI, and robots


As artificial intelligence and machine learning algorithms continue to advance, ensuring that AI systems can explain their decision-making will be extremely important, especially after an accident that results in injury or death, says a leading legal expert on workplace issues.

“We’re building robots and machines driven by AI, we’re putting them into the workplace, and they are becoming more complex by the month, but our ability to control them in terms of what they do and the decisions they make becomes more limited every day,” said Matthew Linton, Of Counsel at Ogletree, Deakins, Nash, Smoak & Stewart, P.C. “Now that we’ve created some algorithms that can do pretty amazing things with data, how do we get the algorithms to explain themselves, in effect get them to show their work?”


Linton will be discussing these and other legal issues around AI and robotics at RoboBusiness 2018, to be held Sept. 25-27, 2018, in Santa Clara, Calif. (Robotics Business Review produces RoboBusiness.)

The panel session, “Safe vs. Safer: Is Public Perception on AI and Robotics Changing?” will take place on Thursday, Sept. 27, at 4:45 p.m., as the closing keynote for RoboBusiness. Joining Linton on the panel are Dawn N. Castillo, MPH, from the National Institute for Occupational Safety and Health, and Jeff Burnstein, president of the Association for Advancing Automation (A3). Linton spoke with Robotics Business Review ahead of the show to discuss workplace safety, AI, ethics, and robotics.

Linton outlined a hypothetical example of a 200-ton autonomous coal truck moving down a path when somebody walks in front of the truck. The software has to make a decision – does it stop, go around, sound an alarm? In most cases, the algorithm has rules for what the vehicle does. However, Linton said the algorithms are not particularly capable of explaining to humans why they made the decisions they did.
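To make the contrast concrete, here is a minimal, purely illustrative Python sketch of the kind of rule-based controller in Linton's hypothetical, written so that every action carries a human-readable rationale. The Obstacle class, the 50 m threshold, and the action names are hypothetical, invented for this example rather than taken from any real autonomous-vehicle system.

    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        kind: str          # e.g. "person", "vehicle", "debris"
        distance_m: float  # distance ahead of the truck in metres

    def decide(obstacle: Obstacle) -> tuple[str, str]:
        """Return an action and a human-readable rationale for it.

        Recording which rule fired, and the inputs that triggered it, is
        one simple way to let a rule-based controller "show its work"
        after an incident.
        """
        if obstacle.kind == "person" and obstacle.distance_m < 50:
            # Hypothetical 50 m hard-stop zone, for illustration only.
            return ("emergency_stop",
                    f"person detected {obstacle.distance_m:.0f} m ahead, "
                    "inside the 50 m hard-stop zone")
        if obstacle.kind == "person":
            return ("slow_and_alarm",
                    "person detected beyond the hard-stop zone; "
                    "slowing and sounding the alarm")
        return ("continue",
                f"non-person obstacle ({obstacle.kind}); continuing on planned path")

    action, rationale = decide(Obstacle(kind="person", distance_m=35.0))
    print(f"{action}: {rationale}")
    # emergency_stop: person detected 35 m ahead, inside the 50 m hard-stop zone

Hand-written rules like these can state why they fired; a trained model's decision emerges from weights no one wrote by hand, which is exactly the explanatory gap Linton describes.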

Keep reading on RoboticsBusinessReview.com
