Update Tensorboard use
committed by Nicola Demo
parent b38b0894b1
commit 67a2b0796c
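Every touched file follows the same pattern: the in-notebook TensorBoard magics are removed and replaced by print statements that point the reader at a terminal workflow. Assuming the trainers keep writing their logs to ./tutorial_logs (as the existing tutorials do), the workflow the new cells describe looks like this sketch:

    # the replacement cells only print instructions, e.g.:
    print('\nTo load TensorBoard run load_ext tensorboard on your terminal')
    print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\n")

    # the reader then runs, in a shell:
    #   tensorboard --logdir 'tutorial_logs'
    # and opens http://localhost:6006 (TensorBoard's default port) in a browser.
    # Inside Jupyter, the removed magics (%load_ext tensorboard,
    # %tensorboard --logdir 'tutorial_logs') remain a valid alternative.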
tutorials/tutorial1/tutorial.ipynb (vendored, 8 lines changed)

@@ -505,7 +505,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 10,
+"execution_count": null,
 "id": "fcac93e4",
 "metadata": {},
 "outputs": [
@@ -546,10 +546,8 @@
 }
 ],
 "source": [
-"# Load the TensorBoard extension\n",
-"%load_ext tensorboard\n",
-"# Show saved losses\n",
-"%tensorboard --logdir 'tutorial_logs'"
+"print('\\nTo load TensorBoard run load_ext tensorboard on your terminal')\n",
+"print(\"To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\\n\")"
 ]
 },
 {
tutorials/tutorial1/tutorial.py (vendored, 8 lines changed)

@@ -261,13 +261,11 @@ plt.legend()
 
 # The solution overlaps the actual one, and the two are practically indistinguishable. We can also take a look at the loss using `TensorBoard`:
 
-# In[10]:
+# In[ ]:
 
 
-# Load the TensorBoard extension
-get_ipython().run_line_magic('load_ext', 'tensorboard')
-# Show saved losses
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print('\nTo load TensorBoard run load_ext tensorboard on your terminal')
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\n")
 
 
 # As we can see, the loss has not reached a minimum, suggesting that we could train for longer! Alternatively, we can also take a look at the loss using callbacks. Here we use `MetricTracker` from `pina.callback`:
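The context line closing this hunk points to `MetricTracker` from `pina.callback` as the callback-based way to inspect the loss. A minimal sketch of that route: the import path is taken from the comment above (other PINA releases expose it as `pina.callbacks`), `pinn` stands for the solver built earlier in the tutorial, and the `trainer.callbacks[0].metrics` layout is an assumption to check against the installed version:

    import matplotlib.pyplot as plt
    from pina import Trainer
    from pina.callback import MetricTracker  # path as quoted above; may be pina.callbacks elsewhere

    # re-train with the tracker attached; `pinn` is the solver defined earlier in the tutorial
    trainer = Trainer(solver=pinn, max_epochs=1500, callbacks=[MetricTracker()])
    trainer.train()

    # the tracker is assumed to map metric names to their per-epoch history
    metrics = trainer.callbacks[0].metrics
    for name, values in metrics.items():
        plt.plot([float(v) for v in values], label=name)
    plt.yscale("log")
    plt.xlabel("epoch")
    plt.legend()
    plt.show()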
tutorials/tutorial2/tutorial.ipynb (vendored, 56 lines changed)

File diff suppressed because one or more lines are too long
tutorials/tutorial2/tutorial.py (vendored, 7 lines changed)

@@ -311,12 +311,11 @@ trainer_learn.train()
 
 # Let us compare the training losses for the various types of training
 
-# In[10]:
+# In[ ]:
 
 
-# Load the TensorBoard extension
-get_ipython().run_line_magic('load_ext', 'tensorboard')
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print('To load TensorBoard run load_ext tensorboard on your terminal')
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal")
 
 
 # ## What's next?
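The hunk above keeps the comment about comparing the training losses for the various types of training, while the notebook itself now only prints the terminal command. If you want to compare the logged curves without opening the TensorBoard UI, the event files under tutorial_logs can be read directly with the `tensorboard` Python package; the directory glob and the tag selection below are assumptions that depend on how the loggers were configured:

    import glob
    import matplotlib.pyplot as plt
    from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

    # each training run writes its own event files; adjust the glob to your log layout
    for run_dir in sorted(glob.glob("tutorial_logs/**/version_*", recursive=True)):
        acc = EventAccumulator(run_dir)
        acc.Reload()                      # parse the event files on disk
        tags = acc.Tags()["scalars"]      # scalar tags actually logged in this run
        if not tags:
            continue
        tag = "mean_loss" if "mean_loss" in tags else tags[0]  # tag name is an assumption
        events = acc.Scalars(tag)
        plt.plot([e.step for e in events], [e.value for e in events], label=run_dir)

    plt.yscale("log")
    plt.xlabel("step")
    plt.ylabel("loss")
    plt.legend()
    plt.show()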
tutorials/tutorial3/tutorial.ipynb (vendored, 99 lines changed)

File diff suppressed because one or more lines are too long
tutorials/tutorial3/tutorial.py (vendored, 13 lines changed)

@@ -194,12 +194,11 @@ trainer.train()
 
 # Let's now plot the logged losses to see how they vary during training. For this, we will use `TensorBoard`.
 
-# In[5]:
+# In[ ]:
 
 
-# Load the TensorBoard extension
-get_ipython().run_line_magic('load_ext', 'tensorboard')
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print('\nTo load TensorBoard run load_ext tensorboard on your terminal')
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal\n")
 
 
 # Notice that the loss on the boundaries of the spatial domain is exactly zero, as expected! After the training is completed, one can plot some results using `matplotlib`. We plot the predicted output on the left, the true solution in the center, and the difference on the right using the `plot_solution` function.
@@ -335,12 +334,12 @@ plt.figure(figsize=(12, 6))
 plot_solution(solver=pinn, time=1)
 
 
-# We can see now that the results are way better! This is due to the fact that previously the network was not correctly learning the initial condition, leading to a poor solution when time evolved. By imposing the initial condition, the network is able to correctly solve the problem. We can also see using TensorBoard how the two losses decreased:
+# We can see now that the results are way better! This is due to the fact that previously the network was not correctly learning the initial condition, leading to a poor solution when time evolved. By imposing the initial condition, the network is able to correctly solve the problem. We can also see how the two losses decreased using TensorBoard.
 
-# In[11]:
+# In[ ]:
 
 
-get_ipython().run_line_magic('tensorboard', "--logdir 'tutorial_logs'")
+print("To visualize the loss you can run tensorboard --logdir 'tutorial_logs' on your terminal")
 
 
 # ## What's next?
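The second tutorial3 hunk sits right after a call to the tutorial's `plot_solution` helper, which draws the prediction, the true solution, and their pointwise difference side by side. A rough, self-contained sketch of that plotting idea, with a hypothetical function name and placeholder arrays rather than the tutorial's actual implementation:

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_three_panels(predicted, exact, extent=(0.0, 1.0, 0.0, 1.0)):
        """Show prediction, exact solution, and their difference as three images."""
        fields = [predicted, exact, predicted - exact]
        titles = ["prediction", "exact", "difference"]
        fig, axes = plt.subplots(1, 3, figsize=(12, 4))
        for ax, field, title in zip(axes, fields, titles):
            im = ax.imshow(field, origin="lower", extent=extent, aspect="auto")
            ax.set_title(title)
            ax.set_xlabel("x")
            ax.set_ylabel("t")
            fig.colorbar(im, ax=ax)
        plt.tight_layout()
        plt.show()

    # usage with dummy data standing in for the network output and the true field
    grid = np.linspace(0.0, 1.0, 50)
    exact = np.sin(np.pi * grid)[None, :] * np.exp(-grid)[:, None]
    plot_three_panels(exact + 0.01 * np.random.randn(*exact.shape), exact)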