diff --git a/tutorials/tutorial10/tutorial.ipynb b/tutorials/tutorial10/tutorial.ipynb
index 662ffad..d8e6fb6 100644
--- a/tutorials/tutorial10/tutorial.ipynb
+++ b/tutorials/tutorial10/tutorial.ipynb
@@ -382,7 +382,7 @@
     "\n",
     "1. Train the network for longer or with different layer sizes and assert the finaly accuracy\n",
     "\n",
-    "2. We left a more challenging dataset [Data_KS2.mat](tutorial10/dat/Data_KS2.mat) where $A_k \\in [-0.5, 0.5]$, $\\ell_k \\in [1, 2, 3]$, $\\phi_k \\in [0, 2\\pi]$ for loger training\n",
+    "2. We left a more challenging dataset [Data_KS2.mat](dat/Data_KS2.mat) where $A_k \\in [-0.5, 0.5]$, $\\ell_k \\in [1, 2, 3]$, $\\phi_k \\in [0, 2\\pi]$ for longer training\n",
     "\n",
     "3. Compare the performance between the different neural operators (you can even try to implement your favourite one!)"
   ]