Optimization of cloth simulation parameters by considering static and dynamic features

Shoji Kunitomo*, Shinsuke Nakamura, Shigeo Morishima

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Citations (Scopus)


Realistic drape and motion of virtual clothing are now possible with an up-to-date cloth simulator, but it is still difficult and time-consuming to adjust and tune the many parameters needed to reproduce the authentic look of a particular real fabric. Bhat et al. [2003] proposed a way to estimate the parameters from video of real fabrics. However, their method projects structured light patterns onto the fabric, so it may not estimate the parameters accurately when the fabric has colors or textures. In addition to the structured light patterns, they use a motion capture system to track how the fabric moves. In this paper, we introduce a new method that uses only a motion capture system, attaching a few markers to the fabric surface without any other devices. With this method, animators can easily estimate the parameters of many kinds of fabrics. Authentic appearance and motion of simulated fabrics are achieved by minimizing an error function between captured motion data and synthetic motion that considers both static and dynamic cloth features.
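The core idea above — fitting simulation parameters by minimizing an error between captured and simulated motion, with both static (drape) and dynamic (trajectory) terms — can be sketched with a toy example. The spring model, the error weighting, and the grid-search optimizer below are all illustrative assumptions; the abstract does not specify the simulator or the minimization method.

```python
import math

def simulate(stiffness, damping, steps=200, dt=0.01):
    """Toy 1-D marker: damped spring relaxing toward rest position 0.
    A stand-in (assumption) for a full mass-spring cloth simulator."""
    x, v = 1.0, 0.0  # initial displacement and velocity
    traj = []
    for _ in range(steps):
        a = -stiffness * x - damping * v  # spring + damping forces
        v += a * dt                       # semi-implicit Euler step
        x += v * dt
        traj.append(x)
    return traj

def error(params, captured):
    """Combined error: a dynamic term (per-frame trajectory mismatch)
    plus a static term (final drape position). The 10x weight on the
    static term is an assumed, illustrative choice."""
    traj = simulate(*params)
    dynamic = sum((a - b) ** 2 for a, b in zip(traj, captured)) / len(captured)
    static = (traj[-1] - captured[-1]) ** 2
    return dynamic + 10.0 * static

# "Captured" marker motion: generated from ground-truth parameters here;
# in the paper this would come from motion-captured markers on real fabric.
true_params = (40.0, 3.0)
captured = simulate(*true_params)

# Simple grid search over (stiffness, damping) as a stand-in optimizer.
best, best_err = None, float("inf")
for k in range(10, 80, 5):
    for c in (1.0, 2.0, 3.0, 4.0, 5.0):
        e = error((float(k), c), captured)
        if e < best_err:
            best, best_err = (float(k), c), e

print(best)  # the grid point matching the true parameters
```

Because the "captured" data is synthesized from parameters that lie on the search grid, the minimizer recovers them exactly; with real motion-capture data the error surface would be noisier and a gradient-free optimizer (e.g. simulated annealing) would be a more realistic choice.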

Original language: English
Title of host publication: ACM SIGGRAPH 2010 Posters, SIGGRAPH '10
Publication status: Published - 2010
Event: ACM SIGGRAPH 2010 Posters, SIGGRAPH '10 - Los Angeles, CA, United States
Duration: 2010 Jul 26 - 2010 Jul 30

Publication series

Name: ACM SIGGRAPH 2010 Posters, SIGGRAPH '10


Conference: ACM SIGGRAPH 2010 Posters, SIGGRAPH '10
Country/Territory: United States
City: Los Angeles, CA

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Software


