
Compiling the Model

宝哥大数据 · Published: 2019-12-18 10:45:58

1. Compiling the Model

1.1. Using the built-in API
import tensorflow as tf
from tensorflow import keras

model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=1e-3),
              loss='sparse_categorical_crossentropy',
              metrics=['sparse_categorical_accuracy'])

There are many built-in optimizers, losses, and metrics to choose from, so you generally do not need to define your own (see the string-shortcut sketch after the list):

  • Optimizers:
    • SGD() (with or without momentum)
    • RMSprop()
    • Adam()
    • etc.
  • Losses:
    • MeanSquaredError()
    • KLDivergence()
    • CosineSimilarity()
    • etc.
  • Metrics:
    • AUC()
    • Precision()
    • Recall()
    • etc.
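
These built-ins can also be referenced by string identifiers, which stand for their default-configured instances. A minimal sketch (assuming TF 2.x and the model from 1.1):

# String shortcuts map to the default-configured classes listed above;
# use the class form (as in 1.1) when you need non-default arguments
# such as a custom learning rate.
model.compile(optimizer='rmsprop',
              loss='sparse_categorical_crossentropy',
              metrics=['sparse_categorical_accuracy'])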
1.2. Custom losses

1.2.1. Custom loss as a function

The example below shows a loss function that computes the mean difference between the true values and the predictions:

def TestCustomLosses():
    # First way: a plain function taking (y_true, y_pred) and returning a tensor
    def basic_loss_function(y_true, y_pred):
        return tf.math.reduce_mean(y_true - y_pred)

    # Build the model
    model = createModel()

    # Load the data
    x_test, y_test, x_val, y_val, x_train, y_train = loadData()

    # Compile the model with the custom loss
    model.compile(optimizer=keras.optimizers.Adam(),
                  loss=basic_loss_function)
    # Train the model
    model.fit(x_train, y_train, batch_size=64, epochs=3)
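
Any callable with the signature (y_true, y_pred) that returns a loss tensor can be passed to compile in the same way. Another minimal sketch, a hand-rolled mean squared error (custom_mse is an illustrative name, not from the original post):

def custom_mse(y_true, y_pred):
    # Cast labels to the prediction dtype, then average the squared error
    y_true = tf.cast(y_true, y_pred.dtype)
    return tf.math.reduce_mean(tf.math.square(y_true - y_pred))

model.compile(optimizer=keras.optimizers.Adam(), loss=custom_mse)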
1.2.2. Custom loss by subclassing keras.losses.Loss
class WeightedBinaryCrossEntropy(keras.losses.Loss):
    """
    Args:
      pos_weight: Scalar to affect the positive labels of the loss function.
      weight: Scalar to affect the entirety of the loss function.
      from_logits: Whether to compute the loss from logits or probabilities.
      reduction: Type of tf.keras.losses.Reduction to apply to the loss.
      name: Name of the loss function.
    """
    def __init__(self, pos_weight, weight, from_logits=False,
                 reduction=keras.losses.Reduction.AUTO,
                 name='weighted_binary_crossentropy'):
        super(WeightedBinaryCrossEntropy, self).__init__(reduction=reduction,
                                                         name=name)
        self.pos_weight = pos_weight
        self.weight = weight
        self.from_logits = from_logits

    def call(self, y_true, y_pred):
        if not self.from_logits:
            # Manually calculate the weighted cross entropy:
            # q * z * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x)),
            # where z is the label, x is the logit, and q is pos_weight.
            # Since y_pred is assumed to already be a probability,
            # sigmoid(x) is replaced by y_pred.

            # q * z * -log(sigmoid(x)); 1e-6 is added as an epsilon
            # to avoid passing a zero into the log
            x_1 = y_true * self.pos_weight * -tf.math.log(y_pred + 1e-6)

            # (1 - z) * -log(1 - sigmoid(x)); same epsilon trick
            x_2 = (1 - y_true) * -tf.math.log(1 - y_pred + 1e-6)

            return tf.add(x_1, x_2) * self.weight

        # Otherwise use the built-in function directly on the logits
        return tf.nn.weighted_cross_entropy_with_logits(y_true, y_pred, self.pos_weight) * self.weight
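
A quick sanity check of the probability branch (illustrative values, not from the original post):

# Confident, correct predictions should yield a small loss
loss_fn = WeightedBinaryCrossEntropy(pos_weight=0.5, weight=2)
y_true = tf.constant([[1.0], [0.0]])
y_pred = tf.constant([[0.9], [0.1]])
print(loss_fn(y_true, y_pred).numpy())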

def TestCustomLoss2():
    model = createModel()
    model.compile(optimizer=keras.optimizers.Adam(),
                  loss=WeightedBinaryCrossEntropy(0.5, 2))
    # Load the data
    x_test, y_test, x_val, y_val, x_train, y_train = loadData()

    model.fit(x_train, y_train, batch_size=64, epochs=3)
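
If a model compiled with this loss needs to be saved and reloaded, Keras has to serialize the constructor arguments. A minimal sketch of the standard pattern (an addition to the class above, not part of the original post) implements get_config:

# Add this method inside WeightedBinaryCrossEntropy:
    def get_config(self):
        # Merge the constructor arguments into the base config so that
        # model saving/loading can reconstruct this loss
        config = super(WeightedBinaryCrossEntropy, self).get_config()
        config.update({'pos_weight': self.pos_weight,
                       'weight': self.weight,
                       'from_logits': self.from_logits})
        return config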