Python tensorflow.keras.layers.GlobalAveragePooling1D() Examples
The following are 2 code examples of tensorflow.keras.layers.GlobalAveragePooling1D().
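GlobalAveragePooling1D collapses the time dimension by averaging, so an input of shape (batch, steps, features) becomes (batch, features). A minimal NumPy sketch of the equivalent computation (toy data, not from either project below):

```python
import numpy as np

# Batch of 2 sequences, each with 3 time steps of 4 features.
x = np.arange(24, dtype=np.float32).reshape(2, 3, 4)

# GlobalAveragePooling1D averages over the time axis (axis=1).
pooled = x.mean(axis=1)

print(pooled.shape)  # (2, 4)
```

Because the output no longer depends on sequence length, this layer is a common way to turn variable-length token sequences into a fixed-size vector for a Dense classifier head, as both examples below do.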
Example #1
Source File: model.py From cloudml-samples with Apache License 2.0 | 6 votes |
# TensorFlow 1.x-style imports used by this snippet
# (tf.train.AdamOptimizer, tf.keras.estimator.model_to_estimator):
import tensorflow as tf
from tensorflow.keras import models
from tensorflow.keras.layers import Dense, Embedding, GlobalAveragePooling1D


def keras_estimator(model_dir, config, learning_rate, vocab_size):
    """Creates a Keras Sequential model with layers.

    Args:
        model_dir: (str) file path where training files will be written.
        config: (tf.estimator.RunConfig) Configuration options to save model.
        learning_rate: (float) Learning rate.
        vocab_size: (int) Size of the vocabulary in number of words.

    Returns:
        A tf.estimator.Estimator wrapping the compiled keras.Model.
    """
    model = models.Sequential()
    model.add(Embedding(vocab_size, 16))
    model.add(GlobalAveragePooling1D())
    model.add(Dense(16, activation=tf.nn.relu))
    model.add(Dense(1, activation=tf.nn.sigmoid))

    # Compile model with learning parameters.
    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
    model.compile(
        optimizer=optimizer,
        loss='binary_crossentropy',
        metrics=['accuracy'])

    estimator = tf.keras.estimator.model_to_estimator(
        keras_model=model, model_dir=model_dir, config=config)
    return estimator
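The Embedding → GlobalAveragePooling1D front end above is just a table lookup followed by a mean over the sequence. A hedged NumPy sketch of the same data flow (toy sizes and a random table standing in for learned weights):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embedding_dim = 10, 16

# Embedding layer: one learned row per token id (random here).
table = rng.normal(size=(vocab_size, embedding_dim)).astype(np.float32)

# A batch of 2 token-id sequences of length 5.
ids = np.array([[1, 2, 3, 4, 0],
                [5, 6, 7, 8, 9]])

embedded = table[ids]            # (2, 5, 16): one vector per token
pooled = embedded.mean(axis=1)   # (2, 16): one vector per sequence

print(embedded.shape, pooled.shape)
```

The pooled (2, 16) batch is what the Dense(16) layer then consumes, independent of how long each input sequence was.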
Example #2
Source File: embedding.py From nlp-journey with Apache License 2.0 | 5 votes |
# Imports used by this snippet:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


def _build_model(self):
    model = keras.Sequential([
        layers.Embedding(self.encoder.vocab_size, self.embedding_dim),
        layers.GlobalAveragePooling1D(),
        layers.Dense(16, activation='relu'),
        layers.Dense(1)  # no activation: outputs logits
    ])
    model.compile(
        optimizer='adam',
        loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
        metrics=['accuracy'])
    model.summary()
    return model
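Unlike Example #1, the final Dense(1) here has no sigmoid; from_logits=True tells the loss to apply the sigmoid internally, which is numerically more stable. A small pure-Python check (toy values, not from either project) that the two formulations agree:

```python
import math

def bce_from_logits(logit, label):
    # Stable form: max(z, 0) - z*y + log(1 + e^{-|z|})
    return max(logit, 0) - logit * label + math.log1p(math.exp(-abs(logit)))

def bce_from_prob(prob, label):
    # Loss on an explicit sigmoid output, as in Example #1.
    return -(label * math.log(prob) + (1 - label) * math.log(1 - prob))

logit, label = 1.5, 1.0
prob = 1 / (1 + math.exp(-logit))  # sigmoid
print(abs(bce_from_logits(logit, label) - bce_from_prob(prob, label)) < 1e-9)  # True
```

The stable form never exponentiates a large positive argument, so it avoids the overflow and log(0) issues the naive sigmoid-then-log path can hit at extreme logits.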