Please note: This seminar will take place in DC 1304.
Arun Jambulapati, Postdoctoral Researcher
Computer Science and Engineering, University of Michigan
The computational challenges posed by recent massive datasets motivate the design of more efficient algorithms. My work takes problems motivated by modern machine learning and develops theoretical tools to answer two central questions: 1) how can we learn from data faster, and 2) how can we select representative data (to potentially speed up downstream processing).
I will first present several results on faster convex optimization leveraging a new theoretical primitive called a “ball optimization oracle”.
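Roughly speaking (sketching the standard definition from the optimization literature; the talk's exact formulation may differ), a ball optimization oracle for a function $f : \mathbb{R}^d \to \mathbb{R}$ takes a query point $\bar{x}$ and a radius $r > 0$ and returns an (approximate) minimizer of $f$ over the ball of radius $r$ around $\bar{x}$:

$$\mathcal{O}_r(\bar{x}) \approx \operatorname*{argmin}_{x \,:\, \|x - \bar{x}\| \le r} f(x).$$

The appeal of this primitive is that minimizing over a small ball can be far easier than minimizing $f$ globally, so global guarantees can be obtained from a modest number of cheap local solves.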
I will give near-optimal algorithms for minimizing convex functions assuming access to this oracle and show how this framework can be applied to yield faster algorithms for important problems in computer science such as regression, differentially private optimization, and robust optimization.
I will follow with results on problems motivated by dataset compression. Leveraging techniques from high-dimensional geometry, I give near-optimal constructions of sparsifiers for sums of norms and generalized linear models.
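Concretely (again as a rough sketch of the standard notion, whose details may differ from the talk's formulation): given (semi)norms $f_1, \dots, f_n$ on $\mathbb{R}^d$ and an accuracy $\epsilon > 0$, a sparsifier is a nonnegative weight vector $w \in \mathbb{R}^n$ with few nonzero entries satisfying

$$(1 - \epsilon) \sum_{i=1}^{n} f_i(x) \;\le\; \sum_{i=1}^{n} w_i f_i(x) \;\le\; (1 + \epsilon) \sum_{i=1}^{n} f_i(x) \qquad \text{for all } x \in \mathbb{R}^d,$$

so that the sparse weighted sum can replace the full sum in downstream computation. Graph cut sparsification is recovered as the special case where each $f_i$ is the seminorm $x \mapsto |x_u - x_v|$ for an edge $(u, v)$.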
These constructions directly imply new sparsifiers for hypergraphs and sums of symmetric submodular functions, and yield faster algorithms for linear regression.
Bio: Arun Jambulapati is a postdoc at the University of Michigan working with Thatchaphol Saranurak and Nikhil Bansal. Previously, he was a Simons Research Fellow and a postdoc at the University of Washington; he completed his Ph.D. at Stanford in 2022.
His primary research interests are in the design of fast algorithms and sparsification. His work has been supported by an NSF Graduate Research Fellowship and a Stanford Graduate Research Fellowship.