Commit edeb9e2 (parent: 161908f)
Showing 10 changed files with 1,006 additions and 618 deletions.
docs/machine_learning/supervised_learning/linear_regression_with_sklearn.ipynb: 85 changes (72 additions, 13 deletions)
Large diffs are not rendered by default.
docs/machine_learning/supervised_learning/linear_regression_with_tensorflow.ipynb: 144 changes (76 additions, 68 deletions)
Large diffs are not rendered by default.
@@ -0,0 +1,249 @@ | ||
{ | ||
"cells": [ | ||
{ | ||
"cell_type": "markdown", | ||
"id": "b39df9bf-2029-4b22-bf83-2496d0d81a16", | ||
"metadata": {}, | ||
"source": [ | ||
"# TensorBoard\n", | ||
"\n", | ||
"TensorBoard provides the visualization and tooling needed for machine learning experimentation:\n", | ||
"\n", | ||
"* Tracking and visualizing metrics such as loss and accuracy\n", | ||
"* Visualizing the model graph (ops and layers)\n", | ||
"* Viewing histograms of weights, biases, or other tensors as they change over time\n", | ||
"* Projecting embeddings to a lower dimensional space\n", | ||
"* Displaying images, text, and audio data\n", | ||
"* Profiling TensorFlow programs\n", | ||
"* [And much more](https://www.tensorflow.org/tensorboard)\n", | ||
"\n", | ||
"```{contents}\n", | ||
":local:\n", | ||
"```" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 1, | ||
"id": "8987b932-8406-478c-a34c-91824687c557", | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"# Load the TensorBoard notebook extension\n", | ||
"%load_ext tensorboard" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 2, | ||
"id": "16d0d658-309e-4a50-a5d7-afbeedb20400", | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"import datetime\n", | ||
"\n", | ||
"import tensorflow as tf" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 3, | ||
"id": "7c2513ba-765b-46d0-8201-1300e8cce5dd", | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"# Clear any logs from previous runs\n", | ||
"!rm -rf ./tensorboard/logs/" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 4, | ||
"id": "014a1061-7f92-49f7-9c77-b5ed44baadd4", | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"# Load the MNIST dataset\n", | ||
"mnist = tf.keras.datasets.mnist\n", | ||
"\n", | ||
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n", | ||
"x_train, x_test = x_train / 255.0, x_test / 255.0" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 8, | ||
"id": "1e861829-3a01-4c89-8e54-6895a838a423", | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"# Function to creates a simple Keras model for classifying the MNIST images into 10 classes\n", | ||
"def create_model():\n", | ||
" return tf.keras.models.Sequential(\n", | ||
" [\n", | ||
" tf.keras.layers.Flatten(input_shape=(28, 28), name=\"layers_flatten\"),\n", | ||
" tf.keras.layers.Dense(512, activation=\"relu\", name=\"layers_dense\"),\n", | ||
" tf.keras.layers.Dropout(0.2, name=\"layers_dropout\"),\n", | ||
" tf.keras.layers.Dense(10, activation=\"softmax\", name=\"layers_dense_2\"),\n", | ||
" ]\n", | ||
" )" | ||
] | ||
}, | ||
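Note: the training output later in this notebook includes a Keras UserWarning about passing input_shape to the first layer. A minimal sketch of an equivalent model that avoids the warning by starting the Sequential model with an explicit tf.keras.Input (same layer sizes and names as create_model above; the function name here is only illustrative):

```python
# Sketch: same architecture as create_model(), but the input is declared with
# tf.keras.Input, so Keras does not warn about input_shape on the Flatten layer.
def create_model_with_explicit_input():
    return tf.keras.models.Sequential(
        [
            tf.keras.Input(shape=(28, 28)),
            tf.keras.layers.Flatten(name="layers_flatten"),
            tf.keras.layers.Dense(512, activation="relu", name="layers_dense"),
            tf.keras.layers.Dropout(0.2, name="layers_dropout"),
            tf.keras.layers.Dense(10, activation="softmax", name="layers_dense_2"),
        ]
    )
```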
{ | ||
"cell_type": "markdown", | ||
"id": "b6e3bb67-07ed-4eb4-9ad7-86d225c297c8", | ||
"metadata": {}, | ||
"source": [ | ||
"## Using TensorBoard with Keras Model.fit()\n", | ||
"\n", | ||
"When training with Keras's Model.fit(), adding the tf.keras.callbacks.TensorBoard callback ensures that logs are created and stored. Additionally, enable histogram computation every epoch with histogram_freq=1 (this is off by default)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": 9, | ||
"id": "57e81e77-d152-422d-89bf-72d0ba1eea54", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"name": "stdout", | ||
"output_type": "stream", | ||
"text": [ | ||
"Epoch 1/5\n" | ||
] | ||
}, | ||
{ | ||
"name": "stderr", | ||
"output_type": "stream", | ||
"text": [ | ||
"/Users/ariefrahmansyah/Library/Caches/pypoetry/virtualenvs/applied-python-training-MLD32oJZ-py3.12/lib/python3.12/site-packages/keras/src/layers/reshaping/flatten.py:37: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n", | ||
" super().__init__(**kwargs)\n" | ||
] | ||
}, | ||
{ | ||
"name": "stdout", | ||
"output_type": "stream", | ||
"text": [ | ||
"\u001b[1m1875/1875\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m6s\u001b[0m 3ms/step - accuracy: 0.8920 - loss: 0.3704 - val_accuracy: 0.9680 - val_loss: 0.1032\n", | ||
"Epoch 2/5\n", | ||
"\u001b[1m1875/1875\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m6s\u001b[0m 3ms/step - accuracy: 0.9687 - loss: 0.1019 - val_accuracy: 0.9769 - val_loss: 0.0773\n", | ||
"Epoch 3/5\n", | ||
"\u001b[1m1875/1875\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m6s\u001b[0m 3ms/step - accuracy: 0.9793 - loss: 0.0664 - val_accuracy: 0.9794 - val_loss: 0.0663\n", | ||
"Epoch 4/5\n", | ||
"\u001b[1m1875/1875\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m6s\u001b[0m 3ms/step - accuracy: 0.9831 - loss: 0.0530 - val_accuracy: 0.9809 - val_loss: 0.0639\n", | ||
"Epoch 5/5\n", | ||
"\u001b[1m1875/1875\u001b[0m \u001b[32m━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[37m\u001b[0m \u001b[1m5s\u001b[0m 3ms/step - accuracy: 0.9862 - loss: 0.0428 - val_accuracy: 0.9800 - val_loss: 0.0671\n" | ||
] | ||
}, | ||
{ | ||
"data": { | ||
"text/plain": [ | ||
"<keras.src.callbacks.history.History at 0x168b35ee0>" | ||
] | ||
}, | ||
"execution_count": 9, | ||
"metadata": {}, | ||
"output_type": "execute_result" | ||
} | ||
], | ||
"source": [ | ||
"model = create_model()\n", | ||
"model.compile(\n", | ||
" optimizer=\"adam\", loss=\"sparse_categorical_crossentropy\", metrics=[\"accuracy\"]\n", | ||
")\n", | ||
"\n", | ||
"# Place the logs in a timestamped subdirectory to allow easy selection of different training runs.\n", | ||
"log_dir = \"tensorboard/logs/fit/\" + datetime.datetime.now().strftime(\"%Y%m%d-%H%M%S\")\n", | ||
"tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)\n", | ||
"\n", | ||
"model.fit(\n", | ||
" x=x_train,\n", | ||
" y=y_train,\n", | ||
" epochs=5,\n", | ||
" validation_data=(x_test, y_test),\n", | ||
" callbacks=[tensorboard_callback],\n", | ||
")" | ||
] | ||
}, | ||
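The cell above configures the callback with only log_dir and histogram_freq. The tf.keras.callbacks.TensorBoard callback also accepts a few other commonly used arguments; a minimal sketch (the values below are illustrative, not the settings used in this run):

```python
# Sketch: a TensorBoard callback with more of its options spelled out.
# Argument names are from tf.keras.callbacks.TensorBoard; the values are examples.
tensorboard_callback_verbose = tf.keras.callbacks.TensorBoard(
    log_dir=log_dir,       # where event files are written
    histogram_freq=1,      # log weight histograms every epoch
    write_graph=True,      # log the model graph for the Graphs dashboard
    write_images=False,    # optionally log model weights as images
    update_freq="epoch",   # "epoch", "batch", or an integer number of batches
    profile_batch=0,       # set to a batch (or a range) to enable the profiler
)
```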
{ | ||
"cell_type": "markdown", | ||
"id": "c94c4b18-83b0-4592-9b3c-b4a606307702", | ||
"metadata": {}, | ||
"source": [ | ||
"Start TensorBoard through the command line or within a notebook experience. The two interfaces are generally the same. In notebooks, use the %tensorboard line magic. On the command line, run the same command without \"%\"." | ||
] | ||
}, | ||
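For example, from a terminal the equivalent invocation would be `tensorboard --logdir tensorboard/logs/fit` (assuming it is run from this notebook's working directory); by default the UI is then served at http://localhost:6006.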
{ | ||
"cell_type": "code", | ||
"execution_count": 7, | ||
"id": "ac9d9cae-1daf-4759-9bd5-a6ac48bd0f41", | ||
"metadata": {}, | ||
"outputs": [ | ||
{ | ||
"data": { | ||
"text/html": [ | ||
"\n", | ||
" <iframe id=\"tensorboard-frame-714c27af5d134e4e\" width=\"100%\" height=\"800\" frameborder=\"0\">\n", | ||
" </iframe>\n", | ||
" <script>\n", | ||
" (function() {\n", | ||
" const frame = document.getElementById(\"tensorboard-frame-714c27af5d134e4e\");\n", | ||
" const url = new URL(\"/\", window.location);\n", | ||
" const port = 6006;\n", | ||
" if (port) {\n", | ||
" url.port = port;\n", | ||
" }\n", | ||
" frame.src = url;\n", | ||
" })();\n", | ||
" </script>\n", | ||
" " | ||
], | ||
"text/plain": [ | ||
"<IPython.core.display.HTML object>" | ||
] | ||
}, | ||
"metadata": {}, | ||
"output_type": "display_data" | ||
} | ||
], | ||
"source": [ | ||
"%tensorboard --logdir tensorboard/logs/fit" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"id": "e83c3f5d-fe46-40ad-b3ae-c02c4c8bf86e", | ||
"metadata": {}, | ||
"source": [ | ||
"A brief overview of the visualizations created in this example and the dashboards (tabs in top navigation bar) where they can be found:\n", | ||
"\n", | ||
"* Scalars show how the loss and metrics change with every epoch. You can use them to also track training speed, learning rate, and other scalar values. Scalars can be found in the Time Series or Scalars dashboards.\n", | ||
"* Graphs help you visualize your model. In this case, the Keras graph of layers is shown which can help you ensure it is built correctly. Graphs can be found in the Graphs dashboard.\n", | ||
"* Histograms and Distributions show the distribution of a Tensor over time. This can be useful to visualize weights and biases and verify that they are changing in an expected way. Histograms can be found in the Time Series or Histograms dashboards. Distributions can be found in the Distributions dashboard.\n", | ||
"\n", | ||
"Additional TensorBoard dashboards are automatically enabled when you log other types of data. For example, the Keras TensorBoard callback lets you log images and embeddings as well. You can see what other dashboards are available in TensorBoard by clicking on the \"inactive\" dropdown towards the top right." | ||
] | ||
} | ||
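As a concrete illustration of the last point, here is a minimal sketch of logging extra data with the tf.summary API directly, outside of the Keras callback. The "custom" sub-directory and the tag names are arbitrary choices for this example, and x_train is assumed to be the normalized MNIST training set loaded earlier:

```python
import numpy as np

# Sketch: write additional summaries (a scalar and a few images) with tf.summary.
custom_log_dir = log_dir + "/custom"  # illustrative sub-directory
file_writer = tf.summary.create_file_writer(custom_log_dir)

with file_writer.as_default():
    # Scalars appear in the Scalars / Time Series dashboards.
    tf.summary.scalar("example/constant_metric", 0.5, step=0)
    # Images must be a 4-D tensor [batch, height, width, channels] with values in [0, 1].
    sample_images = np.reshape(x_train[:5], (-1, 28, 28, 1))
    tf.summary.image("example/training_digits", sample_images, step=0, max_outputs=5)
```

After running such a cell, the Images dashboard becomes active in the same TensorBoard instance.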
], | ||
"metadata": { | ||
"kernelspec": { | ||
"display_name": "Python 3 (ipykernel)", | ||
"language": "python", | ||
"name": "python3" | ||
}, | ||
"language_info": { | ||
"codemirror_mode": { | ||
"name": "ipython", | ||
"version": 3 | ||
}, | ||
"file_extension": ".py", | ||
"mimetype": "text/x-python", | ||
"name": "python", | ||
"nbconvert_exporter": "python", | ||
"pygments_lexer": "ipython3", | ||
"version": "3.12.4" | ||
} | ||
}, | ||
"nbformat": 4, | ||
"nbformat_minor": 5 | ||
} |