
HANDS@ECCV24 Hand Workshop and Challenge: We Sincerely Invite Submissions and Participation

WBOY
Release: 2024-07-12 14:12:10
Original


Introduction

The 8th HANDS will be held at ECCV24 (the afternoon of September 30, in Milan) and includes both a workshop and challenges. HANDS provides a platform for hand-related researchers and practitioners to share their work and discuss potential collaborations. The past seven editions of HANDS have been very successful. This document is a translation; please refer to the official website for authoritative information.

HANDS@ECCV24 Homepage: https://www.php.cn/link/73d4d7b15bfefa13c4a035fa16bb99ed

Hand workshop and submissions: The workshop focuses on hand-related topics and will invite experts in the field to give talks on cutting-edge work. In particular, we sincerely invite submissions of full-length papers.

Hand Challenge: Based on the latest large-scale hand datasets AssemblyHands, ARCTIC, OakInk2, and UmeTrack, the challenge organizes multiple tracks and aims to promote the development of related algorithms.

Paper submission

Submission URL: https://www.php.cn/link/b8c4464e7ee7dc326d7807a8549e449f

Submission topics: all hand-related directions, including but not limited to hand pose estimation, hand reconstruction, hand-object interaction, gesture analysis, gesture generation, sign language recognition, robot control, and robotic grasping.

Submission rules: Full papers undergo a single round of double-blind review with no rebuttal stage, and must follow the official ECCV24 template and page limits. Accepted papers will be published in the ECCV24 conference proceedings. We also accept extended abstract and poster submissions; these will not be published, but they facilitate discussion and promotion of related work. See the official website for details.

Challenge

We organize four challenge tracks based on the latest large-scale hand datasets AssemblyHands, ARCTIC, OakInk2, and UmeTrack. Challenge winners will receive corresponding awards and present their methods at the workshop. The datasets are now available for download, and everyone is welcome to participate. To take part in a competition, please register on the official website first, and please abide by the relevant rules during the challenge. See each track for specific deadlines.

For more information, please see the official website of the challenge: https://www.php.cn/link/73d4d7b15bfefa13c4a035fa16bb99edchallenge2024.html


View-generalization hand pose estimation challenge based on AssemblyHands

This challenge is an expanded version of the AssemblyHands challenge from HANDS2023. As multi-view AR/VR headsets (such as Apple Vision Pro and Meta Quest) receive increasing attention, we extend 3D hand pose estimation from the single-view to the multi-view setting. In this challenge, participants must adapt the provided pre-trained model, in an unsupervised manner, to multiple dual-view settings in which the camera parameters are unknown. Specifically, we build the track on multi-view data from the AssemblyHands dataset.
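One common way to exploit a dual-view setting when camera parameters are unknown is to enforce cross-view consistency of the predicted 3D poses: keypoints predicted independently from the two views should agree up to a rigid transform. The sketch below illustrates such a consistency measure with the Kabsch algorithm; it is a generic illustration under our own assumptions, not the challenge's official baseline, and all function names are hypothetical.

```python
import numpy as np

def kabsch_align(src, dst):
    """Best-fit rigid transform (R, t) mapping src onto dst (Kabsch algorithm).

    src, dst: (N, 3) arrays of corresponding 3D keypoints.
    """
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

def dual_view_consistency(pose_a, pose_b):
    """Mean keypoint residual after rigidly aligning view-A predictions onto view B.

    A non-zero residual signals cross-view disagreement that an unsupervised
    adaptation step could minimize as a training loss.
    """
    R, t = kabsch_align(pose_a, pose_b)
    aligned = pose_a @ R.T + t
    return float(np.mean(np.linalg.norm(aligned - pose_b, axis=1)))
```

Because the alignment is solved per sample, the measure needs no knowledge of the relative camera pose, matching the unknown-calibration constraint of the track.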


ARCTIC-based two-hand hand-object interaction 3D reconstruction challenge

Humans interact with a wide variety of objects every day, so holistic 3D capture of these interactions is crucial for modeling human behavior. Since we naturally interact using our hands, we propose this class-agnostic reconstruction challenge: reconstructing the 3D geometry of hands and objects from video clips without relying on pre-scanned templates. The task is especially challenging because bimanual manipulation exhibits severe hand-object occlusion and dynamic hand-object contact.


Hand-object interaction trajectory generation challenge based on OakInk2

This challenge focuses on using object functionality to synthesize physically plausible hand-object interaction. The goal is to use demonstration trajectories from existing datasets to generate hand-object interaction trajectories that accomplish a specified function in a specified physics simulation environment. Competitors must synthesize trajectories that can be deployed in the simulated environment and that exploit the object's functionality to reach the task goal from a given initial state. We use the OakInk2 dataset (Zhan et al., CVPR'24) as the source of demonstration trajectories. OakInk2 contains demonstrations of hand-object interaction centered on realizing object affordances. In this challenge, we provide demonstrations retargeted from a human hand (MANO) to a dexterous hand (ShadowHand) for the Isaac Gym environment.


Multi-view egocentric hand tracking challenge based on UmeTrack

This challenge aims to promote the development of practical hand tracking algorithms tailored for AR/VR systems with multi-view egocentric cameras. Participants will have the opportunity to leverage the unique capabilities of AR/VR systems to improve the robustness of hand tracking algorithms. Two tasks are offered this year:

1. Hand reconstruction: recover hand shape from calibrated stereo video.

2. Hand pose estimation: track hand poses in calibrated stereo video, given a pre-calibrated hand shape.
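Both tasks build on calibrated stereo: with two known camera projection matrices, each 2D keypoint observed in both views can be lifted to 3D by standard two-view triangulation. The sketch below uses the linear DLT method; it is a generic illustration, not part of the challenge toolkit, and the function name is our own.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one 3D point from two calibrated views via linear DLT.

    P1, P2: (3, 4) camera projection matrices.
    x1, x2: (2,) pixel observations of the same keypoint in each view.
    Each observation contributes two linear constraints on the homogeneous
    3D point; the solution is the null vector of the stacked system.
    """
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                      # smallest-singular-value right vector
    return X[:3] / X[3]             # dehomogenize
```

Applying this per keypoint over a stereo pair yields the 3D hand keypoints from which shape or pose can then be fit.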

To encourage the development of generalizable methods, we will use two datasets with different configurations: UmeTrack and HOT3D, the latter recently released at CVPR.



Source: jiqizhixin.com