Introduction:vialab — last modified 2024/02/15 06:51 by hyjeong
{{indexmenu_n>1}}

====== Introduction to ViaLab (Sept. 2020 ~ ) ======
  
The 4th industrial revolution (Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices using modern smart technologies, such as the Internet of Things (IoT), cyber-physical systems (CPS), cloud computing, intelligent robotics, and artificial intelligence (AI). It will fundamentally change the way we live, work, and relate to each other with unprecedented scale, scope, and complexity. Fusing the last decade of our R&D experience on smart systems with newly emerging AI technology, our research in the <color #ed1c24>**Vehicle Intelligence and Autonomy Lab (ViaLab)**</color> focuses on the enabling technologies of <color #ed1c24>**unmanned autonomous vehicles**</color>, such as self-driving cars, pipeline robots, and automated guided vehicles (AGVs).
  
==== Self-driving Cars ====
  
A self-driving car is a vehicle that is capable of sensing its environment and moving safely without human intervention. One of the key challenges in self-driving cars is the adoption of <color #241ced>**artificial intelligence (AI)**</color> technology to achieve human-level perception and understanding of driving environments.
  
== SONATA Platform ==
We built a self-driving vehicle platform based on the Hyundai SONATA DN8. The roof sensor module consists of a 128-channel LiDAR, 6 short-range cameras covering 360 degrees, and 1 long-range camera facing forward. To better detect vehicles entering from other roads at an intersection, a 16-channel LiDAR and a 77 GHz radar are also mounted on both front corners of the vehicle. With a tactical-grade inertial navigation system (INS) for precise tracking of the vehicle trajectory, this platform can be used to construct high-precision road maps or AI datasets for autonomous driving.
{{ Gallery:sonata.png?960x560 |SONATA}}

== Object Detection and Image Segmentation ==
The video below shows two different approaches to visual perception of urban road images in front of the PNU main gate: <color #241ced>**object detection**</color> (left) and <color #241ced>**semantic segmentation**</color> (right). The object detection network classifies the type of an object and localizes it with a bounding box, while the semantic segmentation network partitions an image into multiple segments, each of which represents an object type. Our object detection network is trained to directly recognize the phase of the horizontal traffic lights used on Korean roads, which is a basic requirement of level-4 autonomous driving. The semantic segmentation network can be used to extract the drivable road area, within which the planning task figures out a collision-free path.
{{ :Introduction:perception_horizontal.mp4?960x560 | Object Detection (Left) and Semantic Segmentation (Right)}}

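The two output formats can be contrasted with a small sketch. This is illustrative only: the class IDs and detection tuple layout below are hypothetical, not the label map of the networks described above.

```python
import numpy as np

# Hypothetical class IDs for illustration; the page does not specify
# the actual label map used by the ViaLab networks.
ROAD, CAR, TRAFFIC_LIGHT = 0, 1, 2

def drivable_mask(seg: np.ndarray) -> np.ndarray:
    """Per-pixel boolean mask of drivable road from a segmentation map."""
    return seg == ROAD

def filter_detections(dets, min_score=0.5):
    """Keep detections (class_id, score, (x1, y1, x2, y2)) above a threshold."""
    return [d for d in dets if d[1] >= min_score]

# A tiny 2x3 segmentation map and two box detections.
seg = np.array([[ROAD, ROAD, CAR],
                [ROAD, CAR,  CAR]])
dets = [(CAR, 0.9, (10, 20, 50, 60)),
        (TRAFFIC_LIGHT, 0.3, (5, 5, 8, 12))]

print(int(drivable_mask(seg).sum()))   # 3 drivable pixels
print(len(filter_detections(dets)))    # 1 confident detection
```

Segmentation yields a dense per-pixel answer (useful for the drivable-area extraction mentioned above), while detection yields a sparse list of labeled boxes.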
== 3-D Object Detection ==
We can also utilize deep neural networks to detect road objects from LiDAR pointclouds. Using the 128-channel Ouster LiDAR pointcloud as input, the video (3x speed) below shows the 3-D bounding boxes of vehicles, buses, motorcycles, and pedestrians. A self-driving vehicle can improve the accuracy and reliability of road object detection through the fusion of camera, LiDAR, and radar sensors.
{{ :Introduction:lidar_object_detection.mp4?960x560 | LiDAR Object Detection}}

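A basic operation behind LiDAR-based detection is testing which points fall inside a predicted 3-D box. The sketch below uses axis-aligned boxes for simplicity; real detectors output yaw-rotated boxes, so treat this as a simplified illustration, not the lab's pipeline.

```python
import numpy as np

def points_in_box(points, center, size):
    """Count LiDAR points inside an axis-aligned 3-D bounding box.

    points: (N, 3) array of x, y, z coordinates.
    center, size: box center and full extents along each axis.
    """
    lo = np.asarray(center) - np.asarray(size) / 2.0
    hi = np.asarray(center) + np.asarray(size) / 2.0
    inside = np.all((points >= lo) & (points <= hi), axis=1)
    return int(inside.sum())

pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 1.0, 1.0],
                [5.0, 5.0, 5.0]])
print(points_in_box(pts, center=(0.5, 0.5, 0.5), size=(2.0, 2.0, 2.0)))  # 2
```

Counting supporting points per box is one common way to score or filter 3-D detections before fusing them with camera and radar evidence.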
== Vehicle Control by Joystick Maneuvers ==
We also connect our computation server to the vehicle actuation system via an X-by-wire interface so that control signals can be converted into the adaptive cruise commands of a commercial vehicle (Hyundai SONATA). The video below shows the steering wheel of our vehicle being controlled by joystick commands:
{{ :Introduction:joystick_control.mp4?960x460 | Steering Wheel Controlled by Joystick}}
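Mapping a joystick axis to a steering-wheel command typically involves clamping to the wheel's range and rate-limiting per control tick. The sketch below is a generic illustration; the angle range, rate limit, and interface are assumed values, not the actual X-by-wire parameters of the SONATA platform.

```python
def steering_command(axis, prev_deg, max_deg=450.0, max_rate_deg=30.0):
    """Map a joystick axis in [-1, 1] to a steering-wheel angle command.

    The command is clamped to +/- max_deg and limited to change by at
    most max_rate_deg per control tick, so the wheel moves smoothly.
    """
    target = max(-1.0, min(1.0, axis)) * max_deg
    step = max(-max_rate_deg, min(max_rate_deg, target - prev_deg))
    return prev_deg + step

# Full right deflection ramps the wheel up 30 degrees per tick.
angle = 0.0
for _ in range(5):
    angle = steering_command(1.0, angle)
print(angle)  # 150.0
```

Rate limiting is what keeps an abrupt stick motion from commanding an instantaneous (and mechanically unsafe) wheel jump.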
  
  
==== Automated Valet Parking (AVP) ====
Among various self-driving scenarios, automated valet parking (AVP) is expected to be commercialized the fastest. It is also a promising self-driving technology that can be applied to automated guided vehicles (AGVs) for the automation of smart factories and logistics warehouses.
  
== Golf Cart Platform ==
We built an AVP golf cart platform that consists of a sensor module (a 64-channel LiDAR, an inertial navigation system (INS), and a brake pedal sensor), an information processing module inside the trunk, and a control module for steering and acceleration.
{{ Gallery:golfcart.png?960x560 |Golf Cart Platform}}
  
== Control Authority Switching ==
In the manual driving mode, the driver requests parking through the AVP app, and the golf cart then switches to the self-driving mode. In an emergency during self-driving, the control authority switching system allows the driver to press the brake pedal and immediately take back manual control to react to the emergency.
{{ :Introduction:cas.mp4?960x560 | Control Authority Switching}}

== SLAM for Electric Vehicle ==
The core technology of self-driving is high-precision vehicle positioning. To this end, the AVP golf cart utilizes 3-D LiDAR pointclouds and inertial navigation system (INS) measurements. It builds a map of the driving environment through scan matching between LiDAR pointclouds and a 3-D high-precision map, and simultaneously estimates its real-time position through <color #241ced>**simultaneous localization and mapping (SLAM)**</color> technology.
{{ :Introduction:EVL.mp4?960x560 | SLAM for Electric Vehicle}}

== AVP Demo at PNU Campus ==
AVP enables a vehicle to perform parking tasks automatically without driver intervention: finding a parking space within a designated area, navigating to the spot, and completing the parking maneuver within a designated parking slot.

When the driver arrives at the entrance of the parking lot via manual driving and requests the valet parking service using the smartphone AVP app, the golf cart computes the shortest path from its current position to the destination parking slot. The model predictive control (MPC) module determines the speed and steering of the golf cart during self-driving. The video below demonstrates that our AVP golf cart can park itself successfully at the PNU Jangjeon campus in an <color #241ced>**exclusive traffic**</color> scenario.
{{ :Introduction:avp_et.mp4?960x560 | AVP Demo}}

== AVP Demo with Mixed Traffic ==
In a <color #241ced>**mixed traffic**</color> scenario, our AVP golf cart accurately perceives both stationary and dynamic objects in its surrounding environment, enabling it to generate safe paths and avoid collisions in real time. This capability allows the AVP golf cart to navigate and park itself in unstructured parking areas with arbitrary traffic, without driver intervention.
{{ :Introduction:avp_mt.mp4?960x560 | AVP with Mixed Traffic}}
    
==== Automated Guided Vehicle Control System (ACS) ====

To coordinate the access of multiple AGVs to shared resources, such as intersections, we are developing an <color #ed1c24>**open-source, platform-independent, and vendor-independent AGV control system (ACS)**</color>, which has been deployed in a factory of [[https://www.swhitech.com |Sungwoo HiTech]] since December 2023.

The <color #ed1c24>**AGV abstraction layer (AAL)**</color> of our ACS mitigates the protocol inconsistency across multi-vendor AGVs and provides a unified protocol interface to the core modules of our ACS. The video (2x speed) below shows that our ACS can successfully coordinate the simultaneous access of [[https://www.meidensha.com/products/logistics/prod_01/index.html | Meidensha]] and [[https://www.aiki-tcs.co.jp/carrybee?lang=en | Aichi CarryBee]] AGVs at the intersection:
{{ :Introduction:agv_final.mp4?960x540 | ACS Traffic Coordination}}

== ACS Deployment at Sungwoo HiTech ==
Our ACS was successfully deployed on a new production line of Sungwoo HiTech's Seo-Chang factory in December 2023. The ACS efficiently manages the flow of AGVs at intersections, optimizing scheduling and dispatching through integration with Sungwoo HiTech's Manufacturing Execution System (MES).
{{ :Introduction:acs_seochang_factory.mp4?960x540 |ACS @ Seo-Chang}}

==== Non-Destructive In-Line Inspection for Smart Infrastructure ====

To be announced soon...
  
====== Introduction to NSSLab (Sept. 2008 ~ Aug. 2020) ======