[Seminar] Co-Optimizing Human-System Performance in VR/AR
Friday, November 4, 2022
11:00 am - 12:00 pm
Speaker
Dr. Qi Sun
New York University
Location
PGH 232
Abstract
Virtual and Augmented Reality enable unprecedented possibilities for displaying virtual content, sensing physical surroundings, and tracking human behaviors with high fidelity. However, we still haven't created "superhumans" who can outperform what we are in physical reality, nor a "perfect" XR system that delivers infinite battery life or realistic sensation. In this talk, I will discuss some of our recent research on leveraging eye/muscular sensing and learning to model our perception, reaction, and sensation in virtual environments. Based on this knowledge, we create just-in-time visual content that jointly optimizes human performance (such as reaction speed to events) and system performance (such as reduced display power consumption) in XR.
About the Speaker
Qi Sun is an assistant professor at New York University. Before joining NYU, he was a research scientist at Adobe Research. He received his PhD at Stony Brook University. His research interests lie in perceptual computer graphics, VR/AR, computational cognition, and visual optics. He is a recipient of the IEEE Virtual Reality Best Dissertation Award, with his research recognized as Best Paper and Honorable Mention awards in ACM SIGGRAPH. His research is funded by NASA, NSF, DARPA, NVIDIA, and Adobe.
Host
Dr. Zhigang Deng
