BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:Australia/Melbourne
X-LIC-LOCATION:Australia/Melbourne
BEGIN:DAYLIGHT
TZOFFSETFROM:+1000
TZOFFSETTO:+1100
TZNAME:AEDT
DTSTART:19721003T020000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=1SU
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:19721003T020000
TZOFFSETFROM:+1100
TZOFFSETTO:+1000
TZNAME:AEST
RRULE:FREQ=YEARLY;BYMONTH=4;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260114T163641Z
LOCATION:Meeting Room C4.9+C4.10\, Level 4 (Convention Centre)
DTSTART;TZID=Australia/Melbourne:20231214T144000
DTEND;TZID=Australia/Melbourne:20231214T145000
UID:siggraphasia_SIGGRAPH Asia 2023_sess132_papers_528@linklings.com
SUMMARY:Content-based Search for Deep Generative Models
DESCRIPTION:Daohan Lu, Sheng-Yu Wang, Nupur Kumari, Rohan Agarwal, and Mia
  Tang (Carnegie Mellon University); David Bau (Northeastern University); a
 nd Jun-Yan Zhu (Carnegie Mellon University)\n\nThe growing proliferation o
 f customized and pretrained generative models has made it infeasible for a
  user to be fully cognizant of every model in existence. To address this n
 eed, we introduce the task of content-based model search: given a query an
 d a large set of generative models, finding the models that best match the
  query. As each generative model produces a distribution of images, we for
 mulate the search task as an optimization problem to select the model with
  the highest probability of generating similar content as the query. \nWe 
 introduce a formulation to approximate this probability given the query fr
 om different modalities, e.g., image, sketch, and text. Furthermore, we pr
 opose a contrastive learning framework for model retrieval, which learns t
 o adapt features for various query modalities. We demonstrate that our met
 hod outperforms several baselines on Generative Model Zoo, a new benchmark
  we create for the model retrieval task.\n\nRegistration Category: Full Ac
 cess\n\nSession Chair: Jun-Yan Zhu (Carnegie Mellon University)\n\n
URL:https://asia.siggraph.org/2023/full-program?id=papers_528&sess=sess132
END:VEVENT
END:VCALENDAR
