export tutorials changed in db9df8b
committed by Dario Coscia
parent a94791f0ff
commit e3d4c2fc1a
tutorials/tutorial7/tutorial.py (64 lines changed)
@@ -2,15 +2,15 @@
# coding: utf-8

# # Tutorial: Inverse Problem Solving with Physics-Informed Neural Network
#
# [Open In Colab](https://colab.research.google.com/github/mathLab/PINA/blob/master/tutorials/tutorial7/tutorial.ipynb)
#
# ## Introduction to the Inverse Problem
#
# This tutorial demonstrates how to solve an inverse Poisson problem using Physics-Informed Neural Networks (PINNs).
#
# The problem is defined as a Poisson equation with homogeneous boundary conditions:
#
# \begin{equation}
# \begin{cases}
# \Delta u = e^{-2(x - \mu_1)^2 - 2(y - \mu_2)^2} \quad \text{in } \Omega, \\
@@ -18,18 +18,18 @@
# u(\mu_1, \mu_2) = \text{data}
# \end{cases}
# \end{equation}
#
# Here, $\Omega$ is the square domain $[-2, 2] \times [-2, 2]$, and $\partial \Omega = \Gamma_1 \cup \Gamma_2 \cup \Gamma_3 \cup \Gamma_4$ represents the union of its boundaries.
#
# This type of setup defines an *inverse problem*, which has two primary objectives:
#
# - **Find the solution** $u$ that satisfies the Poisson equation,
# - **Identify the unknown parameters** $(\mu_1, \mu_2)$ that best fit the given data (as described by the third equation in the system).
#
# To tackle both objectives, we will define an `InverseProblem` using **PINA**.
#
# Let's begin with the necessary imports:

# In[ ]:

@@ -45,8 +45,12 @@ if IN_COLAB:
    get_ipython().system('pip install "pina-mathlab[tutorial]"')
    # get the data
    get_ipython().system('mkdir "data"')
    get_ipython().system(
        'wget "https://github.com/mathLab/PINA/raw/refs/heads/master/tutorials/tutorial7/data/pinn_solution_0.5_0.5" -O "data/pinn_solution_0.5_0.5"'
    )
    get_ipython().system(
        'wget "https://github.com/mathLab/PINA/raw/refs/heads/master/tutorials/tutorial7/data/pts_0.5_0.5" -O "data/pts_0.5_0.5"'
    )

import matplotlib.pyplot as plt
import torch
@@ -68,14 +72,14 @@ warnings.filterwarnings("ignore")
seed_everything(883)


# Next, we import the pre-saved data corresponding to the true parameter values $(\mu_1, \mu_2) = (0.5, 0.5)$.
# These values represent the *optimal parameters* that we aim to recover through neural network training.
#
# In particular, we load:
#
# - `input` points — the spatial coordinates where observations are available,
# - `target` points — the corresponding $u$ values (i.e., the solution evaluated at the `input` points).
#
# This data will be used to guide the inverse problem and supervise the network's prediction of the unknown parameters.

# In[11]:

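# --- Illustrative sketch (not part of the commit diff) ----------------------
# A minimal version of the loading step described above, assuming the files
# downloaded in the Colab cell sit under "data/". Only `data_input` is visible
# in this diff; the name `data_output` for the reference solution is an
# assumption made here for illustration.
import torch

data_input = torch.load("data/pts_0.5_0.5", weights_only=False)
data_output = torch.load("data/pinn_solution_0.5_0.5", weights_only=False)

# One observation location per reference value of u.
print(data_input.shape, data_output.shape)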
@@ -88,10 +92,10 @@ data_input = torch.load("data/pts_0.5_0.5", weights_only=False)


# Next, let's visualize the data:
#
# - We'll plot the data points, i.e., the spatial coordinates where measurements are available.
# - We'll also display the reference solution corresponding to $(\mu_1, \mu_2) = (0.5, 0.5)$.
#
# This serves as the ground truth or expected output that our neural network should learn to approximate through training.

# In[12]:

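# --- Illustrative sketch (not part of the commit diff) ----------------------
# The plotting cell itself lies outside the changed hunks; a minimal version of
# the visualisation described above might look like this. It assumes the
# `data_input` / `data_output` tensors from the previous sketch, with the first
# two columns of `data_input` holding the x and y coordinates.
import matplotlib.pyplot as plt

pts = data_input.detach()
vals = data_output.detach().flatten()
plt.scatter(pts[:, 0], pts[:, 1], c=vals, s=10, cmap="viridis")
plt.colorbar(label="reference u")
plt.title(r"Data points and reference solution, $(\mu_1, \mu_2) = (0.5, 0.5)$")
plt.xlabel("x")
plt.ylabel("y")
plt.show()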
@@ -107,10 +111,10 @@ plt.show()


# ## Inverse Problem Definition in PINA
#
# Next, we initialize the Poisson problem, which inherits from the `SpatialProblem` and `InverseProblem` classes.
# In this step, we need to define all the variables and specify the domain in which our unknown parameters $(\mu_1, \mu_2)$ reside.
#
# Note that the Laplace equation also takes these unknown parameters as inputs. These parameters will be treated as variables that the neural network will optimize during the training process, enabling it to learn the optimal values for $(\mu_1, \mu_2)$.

# In[13]:

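# --- Illustrative sketch (not part of the commit diff) ----------------------
# The actual Poisson problem class (inheriting from `SpatialProblem` and
# `InverseProblem`) lives in the unchanged part of the file. As a plain-torch
# illustration of the idea described above, the unknown parameters can be held
# in a learnable tensor that enters the forcing term of the equation, so the
# optimizer updates them together with the network weights. Everything below is
# hypothetical and is not the PINA API.
import torch

mu = torch.nn.Parameter(torch.zeros(2))  # learnable guess for (mu_1, mu_2)

def forcing_term(x, y):
    # Right-hand side of the Poisson equation: e^{-2(x - mu_1)^2 - 2(y - mu_2)^2}
    return torch.exp(-2 * (x - mu[0]) ** 2 - 2 * (y - mu[1]) ** 2)

# During training, `mu` would simply be handed to the optimizer together with
# the network weights, e.g. torch.optim.Adam([*model.parameters(), mu]), and
# the data-fit loss drives it toward the true values (0.5, 0.5).
x = torch.rand(10) * 4 - 2  # sample points in the square [-2, 2] x [-2, 2]
y = torch.rand(10) * 4 - 2
print(forcing_term(x, y).shape)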
@@ -189,11 +193,11 @@ problem.discretise_domain(
)


# Here, we define a simple callback for the trainer. This callback is used to save the parameters predicted by the neural network during training.
# The parameters are saved every 100 epochs as `torch` tensors in a specified directory (in our case, `tutorial_logs`).
#
# The goal of this setup is to read the saved parameters after training and visualize their trend across the epochs. This allows us to monitor how the predicted parameters evolve throughout the training process.

# In[16]:

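# --- Illustrative sketch (not part of the commit diff) ----------------------
# A generic version of the callback described above, written against the
# PyTorch Lightning `Callback` interface that PINA's Trainer builds on. How the
# unknown parameters are reached from the solver differs between PINA versions,
# so a user-supplied getter is injected instead of hard-coding an attribute.
import os

import torch
from lightning.pytorch.callbacks import Callback  # older setups: pytorch_lightning


class SaveParametersCallback(Callback):
    def __init__(self, get_params, save_dir="tutorial_logs", every_n_epochs=100):
        super().__init__()
        self.get_params = get_params        # callable: solver -> tensor with (mu_1, mu_2)
        self.save_dir = save_dir
        self.every_n_epochs = every_n_epochs
        os.makedirs(save_dir, exist_ok=True)

    def on_train_epoch_end(self, trainer, pl_module):
        epoch = trainer.current_epoch
        if epoch % self.every_n_epochs == 0:
            params = self.get_params(pl_module).detach().cpu()
            torch.save(params, os.path.join(self.save_dir, f"parameters_epoch_{epoch}"))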
@@ -266,13 +270,13 @@ plt.show()


# ## What's Next?
#
# We have covered the basic usage of PINNs for inverse problem modeling. Here are some possible directions for further exploration:
#
# 1. **Experiment with different Physics-Informed strategies**: Explore variations in PINN training techniques to improve performance or tackle different types of problems.
#
# 2. **Apply to more complex problems**: Scale the approach to higher-dimensional or time-dependent inverse problems.
#
# 3. **...and many more!**: The possibilities are endless, from integrating additional physical constraints to testing on real-world datasets.
#
# For more resources and tutorials, check out the [PINA Documentation](https://mathlab.github.io/PINA/).