Sequential API
- Python for Integrated Circuits -
- An Online Book -
Python for Integrated Circuits                                                                                   http://www.globalsino.com/ICs/        



A Sequential model is appropriate for a plain stack of layers where each layer has exactly one input tensor and one output tensor. The Keras Sequential API is the easiest way to get up and running with Keras, but it is also the most limited; that is, you cannot create models that:
          i) Share layers.
          ii) Have branches (at least not easily).
          iii) Have multiple inputs.
          iv) Have multiple outputs.
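Models with any of these properties require the Keras Functional API instead. As a hedged illustration (the layer sizes and names below are invented for this sketch, not taken from the book), a model with a shared hidden layer feeding two separate outputs can be built like this:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# A two-output model with a shared hidden layer, which the
# Sequential API cannot express.
inputs = keras.Input(shape=(16,))
shared = layers.Dense(32, activation="relu")(inputs)
out_a = layers.Dense(1, name="regression_head")(shared)
out_b = layers.Dense(4, activation="softmax", name="class_head")(shared)
model = keras.Model(inputs=inputs, outputs=[out_a, out_b])

# Calling the model returns one tensor per output
ya, yb = model(tf.ones((2, 16)))
```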

The Sequential API can be used to create a Keras model with TensorFlow (e.g. on the Vertex AI platform). The Keras Sequential API and Feature Columns can then be used to build a deep neural network (DNN). Once trained, the Keras model can be saved, loaded, and deployed, and then called to make predictions.
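The save/load/predict cycle can be sketched as follows; this is a minimal illustration, and the file name `my_model.keras` is an arbitrary choice for this example:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Build and compile a small model
model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(8, activation="relu"),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Save, reload, and call the reloaded model for predictions
model.save("my_model.keras")
restored = keras.models.load_model("my_model.keras")
preds = restored.predict(np.ones((2, 4)), verbose=0)
```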

Unlike the Keras Sequential API, the Keras Functional API requires us to declare the shape of the input explicitly.
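A minimal sketch of this difference (layer sizes here are illustrative): a Sequential model can defer weight creation until it first sees data, while the Functional API starts from an explicit `keras.Input`:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Sequential: no input shape declared; weights are created lazily
# the first time the model is called on data.
seq = keras.Sequential([layers.Dense(4, activation="relu"), layers.Dense(1)])
y_seq = seq(tf.ones((2, 8)))  # input shape is inferred from this call

# Functional: the input shape must be declared up front.
inputs = keras.Input(shape=(8,))
hidden = layers.Dense(4, activation="relu")(inputs)
outputs = layers.Dense(1)(hidden)
func = keras.Model(inputs=inputs, outputs=outputs)
y_func = func(tf.ones((2, 8)))
```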

The following example shows how to stack layers with the Keras Sequential model:
          import tensorflow as tf
          from tensorflow import keras
          from tensorflow.keras import layers

          # Define a Sequential model with three Dense layers
          model = keras.Sequential([
              keras.Input(shape=(64,)),
              layers.Dense(32, activation="relu", name="layer1"),
              layers.Dense(8, activation="relu", name="layer2"),
              layers.Dense(1, activation="linear", name="output"),
          ])
          # Call the model on a test input: 3 samples with 64 features each
          x = tf.ones((3, 64))
          y = model(x)

An example of a deep neural network can be:
          import tensorflow as tf
          from tensorflow import keras

          (X_train, y_train), (X_test, y_test) = keras.datasets.mnist.load_data()
          # Scale pixel values from [0, 255] to [0, 1]
          X_train, X_test = X_train / 255.0, X_test / 255.0
          # Define a model: flatten each 28x28 image, then apply Dense layers
          myModel = tf.keras.models.Sequential([
              tf.keras.Input(shape=(28, 28)),
              tf.keras.layers.Flatten(),
              tf.keras.layers.Dense(128, activation="relu"),
              tf.keras.layers.Dense(128, activation="relu"),
              tf.keras.layers.Dense(128, activation="relu"),
              tf.keras.layers.Dense(10, activation="softmax")
          ])
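A model of this shape is then typically compiled and trained. The sketch below uses small random arrays in place of the MNIST download so it runs standalone; the optimizer, loss, and epoch count are illustrative choices, not prescribed by the book:

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in for MNIST-shaped data (64 fake 28x28 "images")
X = np.random.rand(64, 28, 28).astype("float32")
y = np.random.randint(0, 10, size=(64,))

model = tf.keras.models.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
# Sparse categorical crossentropy fits integer class labels
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(X, y, epochs=1, batch_size=32, verbose=0)
loss, acc = model.evaluate(X, y, verbose=0)
```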