Getting Started with Generative Adversarial Networks (GANs)

  •    🍨 This post is a learning-log entry for the 🔗365天深度学习训练营 (365-Day Deep Learning Training Camp)
  • 🍦 Reference article: TensorFlow入门实战|第3周:天气识别
  • 🍖 Original author: K同学啊 | tutoring and custom project services available

I. Theoretical Foundations

1. What is a GAN?
A GAN (Generative Adversarial Network) is a neural-network architecture whose design is inspired by game theory. It consists of two key components: a generator and a discriminator. The generator's task is to turn random noise into synthetic samples that resemble real data, while the discriminator's task is to judge whether a given sample comes from the real data or was produced by the generator. Through repeated adversarial training, the two components push each other to improve.

During training, the generator strives to produce increasingly realistic samples, while the discriminator keeps sharpening its ability to tell real samples from generated ones. This game drives both sides forward until, ideally, the generator's samples are realistic enough that the discriminator can no longer reliably separate real from fake. At that point the generator can forge samples of striking realism.
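
Formally, this game is usually written as the minimax objective introduced by Ian Goodfellow (stated here for reference): the discriminator maximizes the value function while the generator minimizes it,

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big].$$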

GANs are widely used for image generation, style transfer, image editing, and more; their distinctive adversarial training scheme makes them exceptionally good at producing high-quality data.

2. What is the generator?
In a GAN, the generator (G) is the component that takes random noise (usually denoted z) as input and, through continual learning and fitting, produces a fake sample G(z) that matches real samples in shape and distribution. The generator is essentially a generative model: it learns the data distribution and its parameters in order to produce new samples.

Mathematically, a generative method first assumes a distribution over the data's explicit or latent variables, then feeds real data into the model to train those variables and parameters, and finally obtains a learned approximation of the distribution from which new data can be drawn. Unlike a purely analytical derivation, the generator does this with machine learning: it keeps observing real data and correcting the model until it ends up with a trained model that can generate samples, a process that is arguably more intuitive.

In practice, the generator creates new data with the help of existing data; for example, it maps a randomly drawn vector of numbers (a point in the latent space) to an image, an audio clip, or other data. When building a generator, you first decide what it should generate and then feed its outputs to the discriminator network for further processing; this interplay during training drives the generator to produce ever more realistic samples.
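
As a minimal sketch of the latent-space idea (a toy one-layer generator, not the full model from Section II), the mapping from random vectors to image-shaped tensors looks like this:

import torch
import torch.nn as nn

# Toy stand-in for the real Generator defined in Section II
toy_generator = nn.Sequential(nn.Linear(100, 28 * 28), nn.Tanh())

z = torch.randn(64, 100)                      # a batch of 64 latent vectors (points in the latent space)
fake = toy_generator(z).view(64, 1, 28, 28)   # one synthetic 28x28 image per latent vector
print(fake.shape)                             # torch.Size([64, 1, 28, 28])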

3. What is the discriminator?
In a GAN, the discriminator (D) is the other key component. For an input sample x it outputs a probability D(x) in [0, 1]; the sample x may be a real sample from the original dataset or an artificial sample G(z) produced by the generator. By convention, the closer D(x) is to 1, the more likely the sample is real, and the closer it is to 0, the more likely it is fake. The discriminator is therefore a binary neural-network classifier whose goal is to tell real samples from fake ones rather than to predict a sample's original class, which also means GAN training is unsupervised: it uses no class labels.

Concretely, the discriminator tries to decide whether the data it receives is real or was produced by the generator network. It classifies its input into predefined categories, which in a GAN means binary classification, and its output is a number between 0 and 1 expressing how likely the current input is to be real: a value near 1 means the discriminator believes the input came from the real data, while a value near 0 means it treats the input as generated. The discriminator is continually optimized so that, in this adversarial process, it separates real from generated samples ever more accurately.
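
A minimal sketch of this behavior (a toy one-layer discriminator, not the full model from Section II): the output is a probability of being real, and binary cross-entropy compares it with label 1 for real samples or 0 for fake ones.

import torch
import torch.nn as nn

# Toy stand-in for the real Discriminator defined in Section II
toy_discriminator = nn.Sequential(nn.Linear(28 * 28, 1), nn.Sigmoid())

x = torch.randn(8, 28 * 28)                    # a batch of 8 flattened "images"
p_real = toy_discriminator(x)                  # shape (8, 1), each value in (0, 1)
bce = nn.BCELoss()
loss_if_real = bce(p_real, torch.ones(8, 1))   # target 1: batch treated as real samples
loss_if_fake = bce(p_real, torch.zeros(8, 1))  # target 0: batch treated as generated samples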

4. How it works
The generative adversarial network is an innovative fusion of game theory and machine learning, proposed by Ian Goodfellow in 2014. Its publication triggered a wave of research, a sign of how quickly the idea was recognized and how intensely it has been studied.

Researchers had long tried to make computers generate data automatically. Early generative algorithms typically used mean squared error as the loss function for measuring the gap between a generated image and a real one, but it turned out that two generated images can have identical mean squared error yet look completely different. This shortcoming motivated Ian Goodfellow to propose the generative adversarial network.

The core of a GAN consists of two models: a generative model (G) and a discriminative model (D). The generator first takes random noise z as input and produces a crude image. A first-generation discriminator D1 is then trained as a binary classifier to label generated images as 0 and real images as 1. To fool D1, the generator is optimized; once the improved generator manages to fool D1, the discriminator is updated in turn and upgraded to D2. This iteration repeats: generator and discriminator keep opposing and improving each other, producing ever more realistic samples in a dynamic game.

The principle behind GANs is that, through this adversarial training, the generator learns to produce realistic data while the discriminator learns to better separate real data from generated data. It is this game that lets GANs generate high-quality, realistic data so successfully.
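
One practical detail worth noting before the code: with the original objective, the generator's gradient from $\log(1 - D(G(z)))$ vanishes early in training, when the discriminator rejects fakes with ease. The common remedy, and the one used in the implementation below, is the non-saturating loss: train the generator to maximize $\log D(G(z))$, i.e. to minimize

$$L_G = -\mathbb{E}_{z \sim p_z(z)}\big[\log D(G(z))\big],$$

which in code is simply the binary cross-entropy between D(G(z)) and the "real" label.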

II. Building and Training the Model

import os
import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.utils import save_image
import torchvision.transforms as transforms

# Create the output and data directories
os.makedirs('./output/images/', exist_ok=True)
os.makedirs('./output/', exist_ok=True)
os.makedirs('./data/MNIST/', exist_ok=True)

# Hyperparameters
n_epochs = 50
batch_size = 64
lr = 0.0002
b1 = 0.5
b2 = 0.999
latent_dim = 100
img_size = 28
channels = 1
sample_interval = 500

# Image shape (1, 28, 28) and flattened pixel count (784)
img_shape = (channels, img_size, img_size)
img_area = np.prod(img_shape)

# Use the CPU as the device
device = torch.device('cpu')

# Download the MNIST dataset
mnist = datasets.MNIST(root='./data/',
                       train=True,
                       download=True,
                       transform=transforms.Compose([
                           transforms.Resize(img_size),
                           transforms.ToTensor(),
                           transforms.Normalize([0.5], [0.5])
                       ]))
# Wrap the dataset in a DataLoader
dataloader = DataLoader(mnist, batch_size=batch_size, shuffle=True)

# Define the discriminator: an MLP mapping a flattened image to the probability that it is real
class Discriminator(nn.Module):
    def __init__(self):
        super(Discriminator, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(img_area, 512),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(512, 256),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(256, 1),
            nn.Sigmoid())

    def forward(self, img):
        img_flat = img.view(img.size(0), -1)
        validity = self.model(img_flat)
        return validity

# Define the generator: an MLP mapping a latent vector z to a flattened 28x28 image
class Generator(nn.Module):
    def __init__(self):
        super(Generator, self).__init__()
        def block(in_feat, out_feat, normalize=True):
            layers = [nn.Linear(in_feat, out_feat)]
            if normalize:
                layers.append(nn.BatchNorm1d(out_feat, 0.8))
            layers.append(nn.LeakyReLU(0.2, inplace=True))
            return layers
        self.model = nn.Sequential(
            *block(latent_dim, 128, normalize=False),
            *block(128, 256),
            *block(256, 512),
            *block(512, 1024),
            nn.Linear(1024, img_area),
            nn.Tanh())

    def forward(self, z):
        imgs = self.model(z)
        imgs = imgs.view(imgs.size(0), *img_shape)
        return imgs

# Instantiate the generator and the discriminator
generator = Generator().to(device)
discriminator = Discriminator().to(device)
criterion = torch.nn.BCELoss()

# Define the optimizers
optimizer_G = torch.optim.Adam(generator.parameters(), lr=lr, betas=(b1, b2))
optimizer_D = torch.optim.Adam(discriminator.parameters(), lr=lr, betas=(b1, b2))

# Train for multiple epochs
for epoch in range(n_epochs):
    for i, (imgs, _) in enumerate(dataloader):
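        # Flatten each 28x28 image into a 784-dimensional vector for the MLP discriminator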
        imgs = imgs.view(imgs.size(0), -1).to(device)
        real_img = imgs
        real_label = torch.ones(imgs.size(0), 1).to(device)
        fake_label = torch.zeros(imgs.size(0), 1).to(device)

        # Train the discriminator
        real_out = discriminator(real_img)
        loss_real_D = criterion(real_out, real_label)
        z = torch.randn(imgs.size(0), latent_dim).to(device)
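        # .detach() keeps gradients from flowing back into the generator during the discriminator update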
        fake_img = generator(z).detach()
        fake_out = discriminator(fake_img)
        loss_fake_D = criterion(fake_out, fake_label)
        loss_D = loss_real_D + loss_fake_D
        optimizer_D.zero_grad()
        loss_D.backward()
        optimizer_D.step()

        # Train the generator
        fake_img = generator(z)
        output = discriminator(fake_img)
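        # Non-saturating generator loss: push D(G(z)) toward the 'real' label (1)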
        loss_G = criterion(output, real_label)
        optimizer_G.zero_grad()
        loss_G.backward()
        optimizer_G.step()

        if (i + 1) % 300 == 0:
            print(f"[Epoch {epoch}/{n_epochs}] [Batch {i}/{len(dataloader)}] [D loss: {loss_D.item()}] [G loss: {loss_G.item()}] [D real: {real_out.data.mean()}] [D fake: {fake_out.data.mean()}]")

        batches_done = epoch * len(dataloader) + i
        if batches_done % sample_interval == 0:
            save_image(fake_img.data[:25], f"./output/images/{batches_done}.png", nrow=5, normalize=True)

# Save the trained models
torch.save(generator.state_dict(), './output/generator.pth')
torch.save(discriminator.state_dict(), './output/discriminator.pth')
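
After training, the saved weights can be reloaded to draw fresh samples. A minimal sketch (assuming the Generator class, latent_dim, and the save_image import from the script above are in scope; the output filename here is just an example):

G = Generator()
G.load_state_dict(torch.load('./output/generator.pth'))
G.eval()
with torch.no_grad():
    z = torch.randn(25, latent_dim)        # 25 fresh latent vectors
    samples = G(z)                         # shape (25, 1, 28, 28)
save_image(samples, './output/generated_grid.png', nrow=5, normalize=True)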

Training output:

[Epoch 0/50] [Batch 299/938] [D loss: 1.2350534200668335] [G loss: 0.6982573866844177] [D real: 0.486375093460083] [D fake: 0.3856542110443115]     
[Epoch 0/50] [Batch 599/938] [D loss: 1.055356740951538] [G loss: 1.1749478578567505] [D real: 0.608492910861969] [D fake: 0.40936288237571716]     
[Epoch 0/50] [Batch 899/938] [D loss: 0.7502129673957825] [G loss: 1.5634181499481201] [D real: 0.7147619724273682] [D fake: 0.30552658438682556]   
[Epoch 1/50] [Batch 299/938] [D loss: 0.9411146640777588] [G loss: 2.2785043716430664] [D real: 0.8037294149398804] [D fake: 0.49536657333374023]   
[Epoch 1/50] [Batch 599/938] [D loss: 0.7559429407119751] [G loss: 2.0363597869873047] [D real: 0.7389096021652222] [D fake: 0.3309986889362335]    
[Epoch 1/50] [Batch 899/938] [D loss: 0.834088921546936] [G loss: 1.5646687746047974] [D real: 0.5999565124511719] [D fake: 0.15042567253112793]    
[Epoch 2/50] [Batch 299/938] [D loss: 0.6064201593399048] [G loss: 2.0028223991394043] [D real: 0.6977243423461914] [D fake: 0.13676750659942627]   
[Epoch 2/50] [Batch 599/938] [D loss: 1.0512481927871704] [G loss: 2.198730945587158] [D real: 0.83066725730896] [D fake: 0.5548945665359497]       
[Epoch 2/50] [Batch 899/938] [D loss: 0.9007444381713867] [G loss: 1.509855031967163] [D real: 0.6789819002151489] [D fake: 0.3269215524196625]     
[Epoch 3/50] [Batch 299/938] [D loss: 1.0143460035324097] [G loss: 2.80765438079834] [D real: 0.8554768562316895] [D fake: 0.5442554950714111]      
[Epoch 3/50] [Batch 599/938] [D loss: 0.8227330446243286] [G loss: 1.4076086282730103] [D real: 0.6630716919898987] [D fake: 0.2709047496318817]    
[Epoch 3/50] [Batch 899/938] [D loss: 0.7104871273040771] [G loss: 2.7154183387756348] [D real: 0.8923393487930298] [D fake: 0.42268577218055725]   
[Epoch 4/50] [Batch 299/938] [D loss: 0.8367196917533875] [G loss: 1.7840416431427002] [D real: 0.7257375717163086] [D fake: 0.3309652805328369]    
[Epoch 4/50] [Batch 599/938] [D loss: 0.921514093875885] [G loss: 2.3290669918060303] [D real: 0.8575055599212646] [D fake: 0.4876272976398468]     
[Epoch 4/50] [Batch 899/938] [D loss: 0.8092324137687683] [G loss: 1.9644924402236938] [D real: 0.800819993019104] [D fake: 0.3975276052951813]     
[Epoch 5/50] [Batch 299/938] [D loss: 1.0172423124313354] [G loss: 1.1513457298278809] [D real: 0.5393208861351013] [D fake: 0.1033107191324234]    
[Epoch 5/50] [Batch 599/938] [D loss: 0.6319830417633057] [G loss: 2.149918794631958] [D real: 0.7879904508590698] [D fake: 0.27086150646209717]    
[Epoch 5/50] [Batch 899/938] [D loss: 0.7417478561401367] [G loss: 2.3792052268981934] [D real: 0.8150661587715149] [D fake: 0.3646332621574402]    
[Epoch 6/50] [Batch 299/938] [D loss: 0.9981493353843689] [G loss: 1.0874526500701904] [D real: 0.5385660529136658] [D fake: 0.07572783529758453]   
[Epoch 6/50] [Batch 599/938] [D loss: 0.9341475963592529] [G loss: 1.1257842779159546] [D real: 0.6259070038795471] [D fake: 0.19128751754760742]   
[Epoch 6/50] [Batch 899/938] [D loss: 0.7261288166046143] [G loss: 2.4304232597351074] [D real: 0.8604062795639038] [D fake: 0.38279032707214355]   
[Epoch 7/50] [Batch 299/938] [D loss: 0.7211419343948364] [G loss: 1.7635079622268677] [D real: 0.7256969213485718] [D fake: 0.2681616246700287]    
[Epoch 7/50] [Batch 599/938] [D loss: 0.8462427854537964] [G loss: 2.013465166091919] [D real: 0.8659587502479553] [D fake: 0.4707939624786377]     
[Epoch 7/50] [Batch 899/938] [D loss: 0.874700129032135] [G loss: 1.412750244140625] [D real: 0.703620433807373] [D fake: 0.33509835600852966]      
[Epoch 8/50] [Batch 299/938] [D loss: 0.9936412572860718] [G loss: 1.937303900718689] [D real: 0.8226107358932495] [D fake: 0.5015699863433838]     
[Epoch 8/50] [Batch 599/938] [D loss: 0.9673251509666443] [G loss: 1.0482304096221924] [D real: 0.5971685647964478] [D fake: 0.2616860270500183]    
[Epoch 8/50] [Batch 899/938] [D loss: 0.9958365559577942] [G loss: 1.6240371465682983] [D real: 0.7205338478088379] [D fake: 0.38011816143989563]   
[Epoch 9/50] [Batch 299/938] [D loss: 0.8471299409866333] [G loss: 1.247656226158142] [D real: 0.6149693727493286] [D fake: 0.1839817613363266]     
[Epoch 9/50] [Batch 599/938] [D loss: 1.0142803192138672] [G loss: 1.0834134817123413] [D real: 0.5832415819168091] [D fake: 0.24945148825645447]   
[Epoch 9/50] [Batch 899/938] [D loss: 0.7995563745498657] [G loss: 1.8723337650299072] [D real: 0.7556633949279785] [D fake: 0.3428336977958679]    
[Epoch 10/50] [Batch 299/938] [D loss: 0.8716166615486145] [G loss: 1.1147598028182983] [D real: 0.650393545627594] [D fake: 0.26904982328414917]   
[Epoch 10/50] [Batch 599/938] [D loss: 1.14439058303833] [G loss: 0.649280846118927] [D real: 0.47541874647140503] [D fake: 0.1174377053976059]     
[Epoch 10/50] [Batch 899/938] [D loss: 0.7188675403594971] [G loss: 2.1484873294830322] [D real: 0.8237780332565308] [D fake: 0.3629501461982727]   
[Epoch 11/50] [Batch 299/938] [D loss: 1.27309250831604] [G loss: 0.5796881914138794] [D real: 0.4333685338497162] [D fake: 0.14252831041812897]    
[Epoch 11/50] [Batch 599/938] [D loss: 1.0187960863113403] [G loss: 1.8960413932800293] [D real: 0.6672532558441162] [D fake: 0.3642686605453491]   
[Epoch 11/50] [Batch 899/938] [D loss: 0.9516346454620361] [G loss: 1.1471421718597412] [D real: 0.6326779127120972] [D fake: 0.2814021706581116]   
[Epoch 12/50] [Batch 299/938] [D loss: 1.121228575706482] [G loss: 0.7069916129112244] [D real: 0.5016646981239319] [D fake: 0.19652320444583893]   
[Epoch 12/50] [Batch 599/938] [D loss: 0.9946940541267395] [G loss: 2.1766204833984375] [D real: 0.8234321475028992] [D fake: 0.5144876837730408]   
[Epoch 12/50] [Batch 899/938] [D loss: 1.0159285068511963] [G loss: 1.2988619804382324] [D real: 0.6870322823524475] [D fake: 0.369596004486084]    
[Epoch 13/50] [Batch 299/938] [D loss: 0.8661847114562988] [G loss: 1.2607330083847046] [D real: 0.6748418211936951] [D fake: 0.29538118839263916]  
[Epoch 13/50] [Batch 599/938] [D loss: 1.3979634046554565] [G loss: 2.3457212448120117] [D real: 0.8795671463012695] [D fake: 0.6639649868011475]   
[Epoch 13/50] [Batch 899/938] [D loss: 1.1415414810180664] [G loss: 0.998738169670105] [D real: 0.5655196309089661] [D fake: 0.2961214482784271]    
[Epoch 14/50] [Batch 299/938] [D loss: 0.8847220540046692] [G loss: 1.4369701147079468] [D real: 0.656835675239563] [D fake: 0.27129489183425903]   
[Epoch 14/50] [Batch 599/938] [D loss: 0.9689455032348633] [G loss: 2.008836030960083] [D real: 0.7757049202919006] [D fake: 0.45278847217559814]   
[Epoch 14/50] [Batch 899/938] [D loss: 1.024115800857544] [G loss: 1.0999586582183838] [D real: 0.5540661811828613] [D fake: 0.22688642144203186]   
[Epoch 15/50] [Batch 299/938] [D loss: 0.8648720979690552] [G loss: 1.837003469467163] [D real: 0.710908055305481] [D fake: 0.32450997829437256]    
[Epoch 15/50] [Batch 599/938] [D loss: 0.9275979995727539] [G loss: 1.5531091690063477] [D real: 0.7024131417274475] [D fake: 0.36999034881591797]  
[Epoch 15/50] [Batch 899/938] [D loss: 1.0830230712890625] [G loss: 1.4858040809631348] [D real: 0.6036790013313293] [D fake: 0.3062855303287506]   
[Epoch 16/50] [Batch 299/938] [D loss: 1.2152941226959229] [G loss: 0.6983291506767273] [D real: 0.45325058698654175] [D fake: 0.1501416563987732]  
[Epoch 16/50] [Batch 599/938] [D loss: 1.0267713069915771] [G loss: 1.2408125400543213] [D real: 0.6347638368606567] [D fake: 0.35814952850341797]  
[Epoch 16/50] [Batch 899/938] [D loss: 0.8941770792007446] [G loss: 1.119341254234314] [D real: 0.6497822999954224] [D fake: 0.3000938296318054]    
[Epoch 17/50] [Batch 299/938] [D loss: 1.0820164680480957] [G loss: 2.1917734146118164] [D real: 0.8169276118278503] [D fake: 0.5367435216903687]   
[Epoch 17/50] [Batch 599/938] [D loss: 1.1285486221313477] [G loss: 0.9780616164207458] [D real: 0.5910371541976929] [D fake: 0.3358427882194519]   
[Epoch 17/50] [Batch 899/938] [D loss: 1.1013814210891724] [G loss: 1.5073280334472656] [D real: 0.6916764974594116] [D fake: 0.4306653141975403]   
[Epoch 18/50] [Batch 299/938] [D loss: 1.1292989253997803] [G loss: 0.9246511459350586] [D real: 0.6083661317825317] [D fake: 0.39105045795440674]  
[Epoch 18/50] [Batch 599/938] [D loss: 0.9190876483917236] [G loss: 1.0186855792999268] [D real: 0.6566669344902039] [D fake: 0.3293992280960083]   
[Epoch 18/50] [Batch 899/938] [D loss: 0.9438257217407227] [G loss: 1.538163423538208] [D real: 0.7003622651100159] [D fake: 0.3908306956291199]    
[Epoch 19/50] [Batch 299/938] [D loss: 0.9675836563110352] [G loss: 1.2700777053833008] [D real: 0.6152490377426147] [D fake: 0.2947152554988861]   
[Epoch 19/50] [Batch 599/938] [D loss: 0.9217583537101746] [G loss: 0.9447553157806396] [D real: 0.5978091955184937] [D fake: 0.24676361680030823]  
[Epoch 19/50] [Batch 899/938] [D loss: 0.9549309015274048] [G loss: 1.183466911315918] [D real: 0.6619632244110107] [D fake: 0.31366482377052307]   
[Epoch 20/50] [Batch 299/938] [D loss: 0.8573518991470337] [G loss: 1.5201596021652222] [D real: 0.6769440770149231] [D fake: 0.28666192293167114]  
[Epoch 20/50] [Batch 599/938] [D loss: 0.9535715579986572] [G loss: 1.323699951171875] [D real: 0.6168364882469177] [D fake: 0.27342545986175537]   
[Epoch 20/50] [Batch 899/938] [D loss: 0.902020275592804] [G loss: 0.991980791091919] [D real: 0.6241742968559265] [D fake: 0.27164947986602783]    
[Epoch 21/50] [Batch 299/938] [D loss: 1.0286413431167603] [G loss: 1.060232400894165] [D real: 0.6510547399520874] [D fake: 0.371548056602478]     
[Epoch 21/50] [Batch 599/938] [D loss: 1.0256130695343018] [G loss: 1.0313621759414673] [D real: 0.6192348599433899] [D fake: 0.3256581127643585]   
[Epoch 21/50] [Batch 899/938] [D loss: 1.032410979270935] [G loss: 2.0093441009521484] [D real: 0.802362322807312] [D fake: 0.5006930232048035]     
[Epoch 22/50] [Batch 299/938] [D loss: 0.854385495185852] [G loss: 1.4936842918395996] [D real: 0.681969940662384] [D fake: 0.28983962535858154]    
[Epoch 22/50] [Batch 599/938] [D loss: 0.8878469467163086] [G loss: 1.0988521575927734] [D real: 0.6428740620613098] [D fake: 0.2834092080593109]   
[Epoch 22/50] [Batch 899/938] [D loss: 0.989640474319458] [G loss: 1.1327638626098633] [D real: 0.6306523680686951] [D fake: 0.3283930718898773]    
[Epoch 23/50] [Batch 299/938] [D loss: 0.8966113924980164] [G loss: 1.5586148500442505] [D real: 0.7475281953811646] [D fake: 0.3886389434337616]   
[Epoch 23/50] [Batch 599/938] [D loss: 1.0263727903366089] [G loss: 2.0425281524658203] [D real: 0.782462477684021] [D fake: 0.4761490225791931]    
[Epoch 23/50] [Batch 899/938] [D loss: 0.9627300500869751] [G loss: 1.341469645500183] [D real: 0.6781212091445923] [D fake: 0.37951239943504333]   
[Epoch 24/50] [Batch 299/938] [D loss: 1.1018644571304321] [G loss: 1.5438084602355957] [D real: 0.7314162850379944] [D fake: 0.46355438232421875]  
[Epoch 24/50] [Batch 599/938] [D loss: 0.9099334478378296] [G loss: 1.2091448307037354] [D real: 0.6896028518676758] [D fake: 0.3326546847820282]   
[Epoch 24/50] [Batch 899/938] [D loss: 1.1444112062454224] [G loss: 0.7966321706771851] [D real: 0.4846753180027008] [D fake: 0.18288815021514893]  
[Epoch 25/50] [Batch 299/938] [D loss: 0.9260637760162354] [G loss: 1.5650575160980225] [D real: 0.7048342227935791] [D fake: 0.3776768743991852]   
[Epoch 25/50] [Batch 599/938] [D loss: 0.9959667921066284] [G loss: 1.2941513061523438] [D real: 0.6270737051963806] [D fake: 0.30052483081817627]  
[Epoch 25/50] [Batch 899/938] [D loss: 1.1650434732437134] [G loss: 0.7404733896255493] [D real: 0.5191278457641602] [D fake: 0.23394332826137543]  
[Epoch 26/50] [Batch 299/938] [D loss: 1.0800296068191528] [G loss: 0.9680988192558289] [D real: 0.6005172729492188] [D fake: 0.32223257422447205]  
[Epoch 26/50] [Batch 599/938] [D loss: 1.1890555620193481] [G loss: 0.6886818408966064] [D real: 0.4532957673072815] [D fake: 0.15304003655910492]  
[Epoch 26/50] [Batch 899/938] [D loss: 1.1685961484909058] [G loss: 1.8664028644561768] [D real: 0.8364435434341431] [D fake: 0.5814744234085083]   
[Epoch 27/50] [Batch 299/938] [D loss: 0.8339351415634155] [G loss: 1.5597097873687744] [D real: 0.7150497436523438] [D fake: 0.31568029522895813]  
[Epoch 27/50] [Batch 599/938] [D loss: 0.9944645166397095] [G loss: 1.395769476890564] [D real: 0.6818631887435913] [D fake: 0.386187881231308]     
[Epoch 27/50] [Batch 899/938] [D loss: 0.9830508232116699] [G loss: 1.3436462879180908] [D real: 0.6536321043968201] [D fake: 0.3324795067310333]   
[Epoch 28/50] [Batch 299/938] [D loss: 0.9554051756858826] [G loss: 1.4756137132644653] [D real: 0.7433133125305176] [D fake: 0.4035921096801758]   
[Epoch 28/50] [Batch 599/938] [D loss: 0.9320023059844971] [G loss: 1.3813360929489136] [D real: 0.7029383778572083] [D fake: 0.3459252119064331]   
[Epoch 28/50] [Batch 899/938] [D loss: 0.9086618423461914] [G loss: 1.4045188426971436] [D real: 0.635227620601654] [D fake: 0.2691218852996826]    
[Epoch 29/50] [Batch 299/938] [D loss: 0.930955171585083] [G loss: 1.3028247356414795] [D real: 0.6660034656524658] [D fake: 0.3072052001953125]    
[Epoch 29/50] [Batch 599/938] [D loss: 0.9306515455245972] [G loss: 1.0087703466415405] [D real: 0.5827988982200623] [D fake: 0.21247431635856628]  
[Epoch 29/50] [Batch 899/938] [D loss: 0.9420373439788818] [G loss: 1.1675053834915161] [D real: 0.5531575083732605] [D fake: 0.15052053332328796]  
[Epoch 30/50] [Batch 299/938] [D loss: 0.9591331481933594] [G loss: 1.1839848756790161] [D real: 0.6546627283096313] [D fake: 0.3189544975757599]   
[Epoch 30/50] [Batch 599/938] [D loss: 1.0712635517120361] [G loss: 2.0601842403411865] [D real: 0.7893093824386597] [D fake: 0.4943881928920746]   
[Epoch 30/50] [Batch 899/938] [D loss: 0.9131834506988525] [G loss: 1.0398011207580566] [D real: 0.6069164872169495] [D fake: 0.24050579965114594]  
[Epoch 31/50] [Batch 299/938] [D loss: 1.0883607864379883] [G loss: 2.1485860347747803] [D real: 0.7985849976539612] [D fake: 0.5069066286087036]   
[Epoch 31/50] [Batch 599/938] [D loss: 0.9745715856552124] [G loss: 1.4295800924301147] [D real: 0.7000250816345215] [D fake: 0.37375637888908386]  
[Epoch 31/50] [Batch 899/938] [D loss: 0.8729394674301147] [G loss: 1.9546561241149902] [D real: 0.7736930847167969] [D fake: 0.40229207277297974]  
[Epoch 32/50] [Batch 299/938] [D loss: 1.2292734384536743] [G loss: 1.549853801727295] [D real: 0.7569393515586853] [D fake: 0.5234301686286926]    
[Epoch 32/50] [Batch 599/938] [D loss: 1.0344629287719727] [G loss: 2.7325339317321777] [D real: 0.8748428821563721] [D fake: 0.5330327749252319]   
[Epoch 32/50] [Batch 899/938] [D loss: 0.8796753883361816] [G loss: 1.1598433256149292] [D real: 0.6687930822372437] [D fake: 0.27408963441848755]  
[Epoch 33/50] [Batch 299/938] [D loss: 1.1222456693649292] [G loss: 2.5464987754821777] [D real: 0.8294387459754944] [D fake: 0.5251374244689941]   
[Epoch 33/50] [Batch 599/938] [D loss: 0.8337017297744751] [G loss: 1.5257548093795776] [D real: 0.7527210116386414] [D fake: 0.34428730607032776]  
[Epoch 33/50] [Batch 899/938] [D loss: 0.8654632568359375] [G loss: 1.5247968435287476] [D real: 0.6634738445281982] [D fake: 0.27118560671806335]  
[Epoch 34/50] [Batch 299/938] [D loss: 0.8042980432510376] [G loss: 1.37660551071167] [D real: 0.6886695027351379] [D fake: 0.26193633675575256]    
[Epoch 34/50] [Batch 599/938] [D loss: 0.883643388748169] [G loss: 2.06406307220459] [D real: 0.8143107891082764] [D fake: 0.42004501819610596]     
[Epoch 34/50] [Batch 899/938] [D loss: 0.9017606973648071] [G loss: 1.6588715314865112] [D real: 0.7001490592956543] [D fake: 0.31763893365859985]  
[Epoch 35/50] [Batch 299/938] [D loss: 1.202994465827942] [G loss: 0.7202847003936768] [D real: 0.45607709884643555] [D fake: 0.14036396145820618]  
[Epoch 35/50] [Batch 599/938] [D loss: 1.0403707027435303] [G loss: 2.3267641067504883] [D real: 0.8316076993942261] [D fake: 0.4976714253425598]   
[Epoch 35/50] [Batch 899/938] [D loss: 0.9158962965011597] [G loss: 1.5470354557037354] [D real: 0.7522283792495728] [D fake: 0.37140563130378723]  
[Epoch 36/50] [Batch 299/938] [D loss: 0.9869617223739624] [G loss: 2.2973015308380127] [D real: 0.8436128497123718] [D fake: 0.4932475984096527]   
[Epoch 36/50] [Batch 599/938] [D loss: 0.8625720739364624] [G loss: 1.2063660621643066] [D real: 0.7057400941848755] [D fake: 0.3107881546020508]   
[Epoch 36/50] [Batch 899/938] [D loss: 0.9308415055274963] [G loss: 1.4096471071243286] [D real: 0.6190749406814575] [D fake: 0.18880674242973328]  
[Epoch 37/50] [Batch 299/938] [D loss: 0.8832848072052002] [G loss: 1.8554438352584839] [D real: 0.7570676803588867] [D fake: 0.3972577750682831]   
[Epoch 37/50] [Batch 599/938] [D loss: 0.9251106977462769] [G loss: 1.375625729560852] [D real: 0.6303102970123291] [D fake: 0.2522880733013153]    
[Epoch 37/50] [Batch 899/938] [D loss: 0.964485764503479] [G loss: 1.0902432203292847] [D real: 0.5885259509086609] [D fake: 0.19822385907173157]   
[Epoch 38/50] [Batch 299/938] [D loss: 0.9908133745193481] [G loss: 1.6421051025390625] [D real: 0.7105517387390137] [D fake: 0.39029496908187866]  
[Epoch 38/50] [Batch 599/938] [D loss: 0.7265905141830444] [G loss: 1.7473435401916504] [D real: 0.7558919787406921] [D fake: 0.27862119674682617]  
[Epoch 38/50] [Batch 899/938] [D loss: 1.0594215393066406] [G loss: 1.684019923210144] [D real: 0.7222703695297241] [D fake: 0.4507278800010681]    
[Epoch 39/50] [Batch 299/938] [D loss: 0.9159432053565979] [G loss: 1.0789000988006592] [D real: 0.6405486464500427] [D fake: 0.26420480012893677]  
[Epoch 39/50] [Batch 599/938] [D loss: 0.8127704858779907] [G loss: 1.2508443593978882] [D real: 0.685287356376648] [D fake: 0.25188061594963074]   
[Epoch 39/50] [Batch 899/938] [D loss: 0.8984431624412537] [G loss: 1.1624631881713867] [D real: 0.5990526080131531] [D fake: 0.22442740201950073]  
[Epoch 40/50] [Batch 299/938] [D loss: 0.8730635643005371] [G loss: 1.1256250143051147] [D real: 0.6206510066986084] [D fake: 0.2117980569601059]   
[Epoch 40/50] [Batch 599/938] [D loss: 0.899864912033081] [G loss: 1.9363987445831299] [D real: 0.7845451235771179] [D fake: 0.4057895243167877]    
[Epoch 40/50] [Batch 899/938] [D loss: 0.788878321647644] [G loss: 1.9694950580596924] [D real: 0.8118752241134644] [D fake: 0.38903263211250305]   
[Epoch 41/50] [Batch 299/938] [D loss: 1.0589418411254883] [G loss: 1.011269450187683] [D real: 0.5342588424682617] [D fake: 0.17791403830051422]   
[Epoch 41/50] [Batch 599/938] [D loss: 0.8985623121261597] [G loss: 1.45697820186615] [D real: 0.6747936010360718] [D fake: 0.2773504853248596]     
[Epoch 41/50] [Batch 899/938] [D loss: 0.9222961664199829] [G loss: 2.3203341960906982] [D real: 0.8031480312347412] [D fake: 0.4381536543369293]   
[Epoch 42/50] [Batch 299/938] [D loss: 0.924498438835144] [G loss: 1.440964698791504] [D real: 0.7294663786888123] [D fake: 0.386924147605896]      
[Epoch 42/50] [Batch 599/938] [D loss: 1.0899465084075928] [G loss: 1.871597409248352] [D real: 0.7736498117446899] [D fake: 0.4781801998615265]    
[Epoch 42/50] [Batch 899/938] [D loss: 0.8787288665771484] [G loss: 2.1557202339172363] [D real: 0.7813157439231873] [D fake: 0.3885624408721924]   
[Epoch 43/50] [Batch 299/938] [D loss: 0.7645089030265808] [G loss: 1.532172679901123] [D real: 0.6758894920349121] [D fake: 0.1813245713710785]    
[Epoch 43/50] [Batch 599/938] [D loss: 0.7722570300102234] [G loss: 1.7845128774642944] [D real: 0.6942862868309021] [D fake: 0.22502407431602478]  
[Epoch 43/50] [Batch 899/938] [D loss: 1.109108805656433] [G loss: 1.9074299335479736] [D real: 0.7986328601837158] [D fake: 0.5067358016967773]    
[Epoch 44/50] [Batch 299/938] [D loss: 0.8472399711608887] [G loss: 1.9521387815475464] [D real: 0.7659008502960205] [D fake: 0.37023526430130005]  
[Epoch 44/50] [Batch 599/938] [D loss: 0.9451081156730652] [G loss: 1.707533836364746] [D real: 0.7530910968780518] [D fake: 0.38833051919937134]   
[Epoch 44/50] [Batch 899/938] [D loss: 0.8926998376846313] [G loss: 1.5450968742370605] [D real: 0.7412923574447632] [D fake: 0.3699657917022705]   
[Epoch 45/50] [Batch 299/938] [D loss: 0.7799042463302612] [G loss: 1.662207841873169] [D real: 0.7865528464317322] [D fake: 0.346537321805954]     
[Epoch 45/50] [Batch 599/938] [D loss: 0.9085474014282227] [G loss: 0.9939476251602173] [D real: 0.6574568152427673] [D fake: 0.23760713636875153]  
[Epoch 45/50] [Batch 899/938] [D loss: 0.8590810894966125] [G loss: 1.942899227142334] [D real: 0.802588701248169] [D fake: 0.378329336643219]      
[Epoch 46/50] [Batch 299/938] [D loss: 1.0076061487197876] [G loss: 1.1324681043624878] [D real: 0.6353162527084351] [D fake: 0.28025197982788086]  
[Epoch 46/50] [Batch 599/938] [D loss: 0.8912186026573181] [G loss: 1.660722017288208] [D real: 0.678089439868927] [D fake: 0.2697850465774536]     
[Epoch 46/50] [Batch 899/938] [D loss: 0.7739388346672058] [G loss: 2.167283535003662] [D real: 0.793948769569397] [D fake: 0.3539980947971344]     
[Epoch 47/50] [Batch 299/938] [D loss: 1.0051631927490234] [G loss: 1.809553861618042] [D real: 0.7550152540206909] [D fake: 0.4090600311756134]    
[Epoch 47/50] [Batch 599/938] [D loss: 0.8563379645347595] [G loss: 1.0784591436386108] [D real: 0.640609860420227] [D fake: 0.23422126471996307]   
[Epoch 47/50] [Batch 899/938] [D loss: 0.7957481145858765] [G loss: 1.8897020816802979] [D real: 0.7726541757583618] [D fake: 0.31628531217575073]  
[Epoch 48/50] [Batch 299/938] [D loss: 0.8358625173568726] [G loss: 1.3485994338989258] [D real: 0.6844103336334229] [D fake: 0.2745668888092041]   
[Epoch 48/50] [Batch 599/938] [D loss: 0.7678362131118774] [G loss: 1.6824095249176025] [D real: 0.7712661027908325] [D fake: 0.34115567803382874]  
[Epoch 48/50] [Batch 899/938] [D loss: 0.7988683581352234] [G loss: 2.0199320316314697] [D real: 0.7913979887962341] [D fake: 0.36441725492477417]  
[Epoch 49/50] [Batch 299/938] [D loss: 0.945759117603302] [G loss: 2.0533831119537354] [D real: 0.8216241002082825] [D fake: 0.4464045763015747]    
[Epoch 49/50] [Batch 599/938] [D loss: 0.9638737440109253] [G loss: 2.1069891452789307] [D real: 0.8191846013069153] [D fake: 0.46080657839775085]  
[Epoch 49/50] [Batch 899/938] [D loss: 0.7438750267028809] [G loss: 1.932855486869812] [D real: 0.7552076578140259] [D fake: 0.29690203070640564]
