export tutorials changed in db9df8b

This commit is contained in:
dario-coscia
2025-05-05 08:59:15 +00:00
committed by Dario Coscia
parent a94791f0ff
commit e3d4c2fc1a
23 changed files with 737 additions and 727 deletions

View File

@@ -2,13 +2,13 @@
# coding: utf-8
# # Tutorial: Data structure for SciML: `Tensor`, `LabelTensor`, `Data` and `Graph`
#
#
# [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mathLab/PINA/blob/master/tutorials/tutorial19/tutorial.ipynb)
#
#
# In this tutorial, we'll quickly go through the basics of data structures for Scientific Machine Learning, covering:
# 1. **PyTorch Tensors** / **PINA LabelTensors**
# 2. **PyTorch Geometric Data** / **PINA Graph**
#
#
# First, let's import the data structures we will use!
# In[ ]:
@@ -34,9 +34,9 @@ from pina import LabelTensor, Graph
# ## PyTorch Tensors
#
#
# A **tensor** is a multi-dimensional matrix used for storing and manipulating data in PyTorch. It's the basic building block for all computations in PyTorch, including deep learning models.
#
#
# You can create a tensor in several ways:
# In[2]:
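# Below is a minimal illustrative sketch of a few common ways to build a tensor (the exact content of this cell is not shown in the diff, so treat it as an example rather than the original code):

import numpy as np
import torch

# Directly from nested Python lists
t_from_list = torch.tensor([[1.0, 2.0], [3.0, 4.0]])

# From an existing NumPy array (shares memory with the array)
t_from_numpy = torch.from_numpy(np.array([1.0, 2.0, 3.0]))

# With factory functions for a given shape
t_zeros = torch.zeros(3, 2)  # all zeros
t_rand = torch.rand(3, 2)    # uniform random values in [0, 1)

print(t_from_list.shape, t_zeros.dtype, t_rand.device)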
@@ -89,13 +89,13 @@ if torch.cuda.is_available():
# To know more about PyTorch Tensors, see the dedicated tutorial done by the PyTorch team [here](https://pytorch.org/tutorials/beginner/introyt/tensors_deeper_tutorial.html).
# ## Label Tensors
#
#
# In scientific machine learning, especially when working with **Physics-Informed Neural Networks (PINNs)**, handling tensors effectively is crucial. Often, we deal with many indices that represent physical quantities such as spatial and temporal coordinates, making it vital to ensure we use the correct indexing.
#
#
# For instance, in PINNs, if the wrong index is used to represent the coordinates of a physical domain, it could lead to incorrect calculations of derivatives, integrals, or residuals. This can significantly affect the accuracy and correctness of the model.
#
#
# ### What are Label Tensors?
#
#
# **Label Tensors** are a specialized type of tensor that keeps track of the labels associated with specific indices. As with torch tensors, we can perform the usual operations, but slicing is simplified by using labels:
# In[7]:
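# A minimal sketch of how this cell might look, assuming the `LabelTensor` constructor and the `extract` method behave as in the PINA documentation (illustrative, not the original cell):

import torch
from pina import LabelTensor

# A plain 2D tensor: 5 points with two coordinates each
tensor = torch.rand(5, 2)

# Wrap it in a LabelTensor, labelling the last dimension as "x" and "y"
label_tensor = LabelTensor(tensor, labels=["x", "y"])

# Columns can now be retrieved by name instead of by position
x_coords = label_tensor.extract(["x"])  # assumed label-based extraction
print(label_tensor.labels, x_coords.shape)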
@@ -109,12 +109,12 @@ label_tensor = LabelTensor(tensor, labels=["x", "y"])
# The label tensor is initialized by passing the tensor and a set of labels. Specifically, the labels must satisfy the following conditions:
#
#
# - At each dimension, the number of labels must match the size of the dimension.
# - At each dimension, the labels must be unique.
#
#
# For example:
#
#
# In[9]:
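# A sketch of what this example cell might contain (the error-raising behaviour on a label/size mismatch is an assumption based on the conditions above):

import torch
from pina import LabelTensor

tensor = torch.rand(4, 3)

# Valid: three labels for a dimension of size three, all unique
ok = LabelTensor(tensor, labels=["u", "v", "w"])
print(f"{ok.full_labels=}")

# Invalid: only two labels for a dimension of size three
# (expected to raise, since the number of labels must match the dimension size)
try:
    LabelTensor(tensor, labels=["u", "v"])
except Exception as err:
    print("Label/size mismatch:", err)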
@@ -137,9 +137,9 @@ print(f"{tensor.full_labels=}")
# ### Label Tensors slicing
#
# One of the powerful features of label tensors is the ability to easily slice and extract specific parts of the tensor based on labels, just like regular PyTorch tensors but with the ease of labels.
#
# Here's how slicing works with label tensors. Suppose we have a label tensor that contains both spatial and temporal data, and we want to slice specific parts of this data to focus on certain time intervals or spatial regions.
# In[26]:
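# A sketch of label-based slicing, assuming a label tensor whose last dimension carries spatial labels "x", "y" and a temporal label "t" (illustrative; the actual cell content is not shown in this diff):

import torch
from pina import LabelTensor

# 10 samples, each carrying spatial coordinates and a time stamp
data = torch.rand(10, 3)
label_tensor = LabelTensor(data, labels=["x", "y", "t"])

# Extract only the spatial part by label, rather than by column index
spatial = label_tensor.extract(["x", "y"])

# Positional slicing still works exactly as for plain tensors
every_other = label_tensor[slice(0, 4, 2), :]

print(spatial.shape, every_other.shape)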
@@ -181,16 +181,16 @@ print(f"Similar to: {label_tensor[slice(0, 4, 2), :]=}")
# ## PyTorch Geometric Data
# PyTorch Geometric (PyG) extends PyTorch to handle graph-structured data. It provides utilities to represent graphs and perform graph-based learning tasks such as node classification, graph classification, and more.
#
#
# ### Graph Data Structure
# PyTorch Geometric uses a custom `Data` object to store graph data. The `Data` object contains the following attributes:
#
#
# - **x**: Node features (tensor of shape `[num_nodes, num_features]`)
#
#
# - **edge_index**: Edge indices (tensor of shape `[2, num_edges]`), representing the graph's connectivity
#
#
# - **edge_attr**: Edge features (optional, tensor of shape `[num_edges, num_edge_features]`)
#
#
# - **y**: Target labels for nodes/graphs (optional)
# In[32]:
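# A minimal illustrative sketch of building such a `Data` object (not necessarily the original cell):

import torch
from torch_geometric.data import Data

# Three nodes with two features each
x = torch.rand(3, 2)

# Connectivity: edges 0->1, 1->2 and their reverses
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])

# One feature per edge (optional)
edge_attr = torch.rand(edge_index.shape[1], 1)

# A graph-level target (optional)
y = torch.tensor([1.0])

data = Data(x=x, edge_index=edge_index, edge_attr=edge_attr, y=y)
print(data)  # Data(x=[3, 2], edge_index=[2, 4], edge_attr=[4, 1], y=[1])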
@@ -229,7 +229,7 @@ print(out) # Output node features after applying GCN
# ## PINA Graph
#
#
# If you've understood Label Tensors and Data in PINA, then you're well on your way to grasping how **PINA Graph** works. Simply put, a **Graph** in PINA is a `Data` object with extra methods for handling label tensors. We highly suggest using `Graph` instead of `Data` in PINA, especially when using label tensors.
# In[36]:
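# A minimal sketch, assuming `Graph` accepts the same keyword arguments as a PyG `Data` object and stores `LabelTensor` node features; check the PINA documentation for the exact constructor:

import torch
from pina import LabelTensor, Graph

# Node features as a LabelTensor, so coordinates stay addressable by name
x = LabelTensor(torch.rand(3, 2), labels=["x", "y"])
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]])

# Assumed to mirror the PyG Data constructor while preserving labels
graph = Graph(x=x, edge_index=edge_index)
print(graph)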
@@ -296,13 +296,13 @@ input_[0]
# ## What's Next?
#
#
# Congratulations on completing the tutorial on **PINA Data Structures**! You now have a solid foundation in using the different data structures within PINA, such as **Tensors**, **Label Tensors**, and **Graphs**. Here are some exciting next steps you can take to continue your learning journey:
#
#
# 1. **Deep Dive into Label Tensors**: Check the documentation of [`LabelTensor`](https://mathlab.github.io/PINA/_rst/label_tensor.html) to learn more about the available methods.
#
#
# 2. **Working with Graphs in PINA**: PINA implements many graph structures, e.g. `KNNGraph`, `RadiusGraph`, and more; see [here](https://mathlab.github.io/PINA/_rst/_code.html#graphs-structures) for further details.
#
#
# 3. **...and many more!**: Consider exploring `LabelTensor` for PINNs!
#
#
# For more resources and tutorials, check out the [PINA Documentation](https://mathlab.github.io/PINA/).