
10:00 - 12:00 30/06/2020

Room F207

Existence and regularity for time fractional diffusion equations and systems 

Speaker: Trần Bảo Ngọc

Abstract: In this seminar, we discuss time fractional diffusion equations involving Caputo fractional derivatives. We focus on:

- Existence and regularity for final value problems for time fractional diffusion equations

- Existence and regularity for final value problems for time fractional diffusion systems

We discuss both linear and nonlinear cases. Spectral theory and fixed point theorems are the main tools employed to establish existence and uniqueness of solutions. Then, by making use of Sobolev embeddings on Hilbert scales and fractional Sobolev spaces, we obtain some useful regularity results for the solutions.
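For orientation, a minimal model of the problems in question, in notation we introduce here rather than take from the talk: the Caputo fractional derivative of order \alpha \in (0,1) is

\[
\partial_t^{\alpha} u(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^{t} (t-s)^{-\alpha}\, u'(s)\, ds,
\]

and a typical final value problem prescribes terminal data in place of an initial condition:

\[
\partial_t^{\alpha} u = \Delta u + f(u) \quad \text{in } \Omega \times (0,T), \qquad
u = 0 \quad \text{on } \partial\Omega \times (0,T), \qquad
u(\cdot,T) = g \quad \text{in } \Omega.
\]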


Speaker: Lý Kim Hà

A brief tour of Cauchy-Riemann equations

Abstract: The main purpose of this talk is to provide a (very) short history of the Cauchy-Riemann equations since the 1960s. Some related problems are also mentioned.

10:00 9/1/2020, Room F207.


Seminar on PDE

4/12/2019

Room E202B

Calderón-Zygmund theory for nonlinear PDEs and applications

14:00 – 15:00

Speaker: Prof. Tuoc Phan (University of Tennessee – Knoxville, US)

Well-posedness of a fractional degenerate forward-backward problem 

15:00 – 16:00

Speaker: Prof. Tan Do (Vietnamese-German University, Vietnam)


Mitigating the Cost of Data-Driven PDE-Constrained Inverse Problems Using Dimensionality Reduction and Deep Learning

Bùi Thanh Tân, University of Texas at Austin

9:00 13-06-2019

Room F207

Abstract: 

Given a hierarchy of reduced-order models for solving inverse problems for quantities of interest, each model with a different level of fidelity and computational cost, a deep learning framework is proposed to improve the models by learning the errors between successive levels.

By statistically modeling the errors of the reduced-order models and using training data obtained from forward solves of the reduced-order models and the higher-fidelity model, we train deep neural networks to learn the error between successive levels of the hierarchy, thereby improving their error bounds. The training of the deep neural networks occurs during the offline phase, and the error bounds can be improved online as new training data are observed.
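A minimal sketch of this offline error-learning step, in Python. The names rom_coarse and rom_fine are hypothetical stand-ins for two successive levels of the hierarchy, and scikit-learn's MLPRegressor stands in for the deep networks; this illustrates the idea under those assumptions, not the authors' implementation.

import numpy as np
from sklearn.neural_network import MLPRegressor

def rom_coarse(x):
    # Hypothetical low-fidelity reduced-order model (a toy function here;
    # in practice, a cheap PDE surrogate evaluated at parameters x).
    return np.sin(x).sum(axis=1)

def rom_fine(x):
    # Hypothetical next level of fidelity in the hierarchy.
    return np.sin(x).sum(axis=1) + 0.1 * np.cos(2.0 * x).sum(axis=1)

# Offline phase: forward solves of both levels supply training data for
# the error between the successive levels.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1.0, 1.0, size=(500, 4))
err_train = rom_fine(X_train) - rom_coarse(X_train)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                   random_state=0).fit(X_train, err_train)

# Online phase: the corrected coarse model approximates the finer level
# at (nearly) the cost of the coarse model alone.
X_new = rng.uniform(-1.0, 1.0, size=(5, 4))
corrected = rom_coarse(X_new) + net.predict(X_new)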

To mitigate the big-data aspect of inverse problems, we have developed a randomized misfit approach that blends random projection theory in high dimensions with inverse problem theory to effectively reduce high-dimensional data while preserving the accuracy of the inverse solution.
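The core of such a randomized misfit can be shown in a few lines; the following toy example is ours, not the authors' method. A Gaussian sketch S with k much smaller than the data dimension compresses the residual while preserving its norm, and hence the least-squares misfit, in expectation and with high probability (Johnson-Lindenstrauss).

import numpy as np

rng = np.random.default_rng(1)
n_data, n_param, k = 10_000, 50, 200        # high-dimensional data, sketch size k

F = rng.standard_normal((n_data, n_param))  # toy linearized forward map
d = rng.standard_normal(n_data)             # toy observed data
m = rng.standard_normal(n_param)            # candidate parameters

# Gaussian sketch: rows scaled so that E[||S r||^2] = ||r||^2 for any r.
S = rng.standard_normal((k, n_data)) / np.sqrt(k)
SF, Sd = S @ F, S @ d                       # precomputed once; every later
                                            # misfit evaluation is k-dimensional

misfit_full = 0.5 * np.linalg.norm(F @ m - d) ** 2
misfit_sketched = 0.5 * np.linalg.norm(SF @ m - Sd) ** 2
# The two values agree to within a few percent with high probability,
# while the sketched misfit never forms the full n_data-dimensional residual.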