
EXPERIMENT – 3.4

Mapped Course Outcome

CO1: Identify and describe soft computing techniques and their roles in building intelligent
machines.

AIM:
Write a program to implement Generative Adversarial Networks (GANs) using genetic
algorithms.

Theory
Generative Adversarial Networks (GANs) are a class of machine learning frameworks
introduced by Ian Goodfellow and his colleagues in 2014. A GAN consists of two neural
networks: a generator and a discriminator. The generator creates fake data, while the
discriminator evaluates how authentic that data appears. Genetic algorithms (GAs) are
optimization techniques inspired by natural selection that evolve a population of candidate
solutions through selection, crossover, and mutation. In this experiment, a GA runs
alongside standard GAN training to evolve a population of latent (noise) vectors so that
the generator's outputs become more convincing to the discriminator.
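Before wiring a GA to the GAN, the evolutionary loop itself can be seen in isolation. The following is a minimal, self-contained sketch using DEAP on a toy problem (the function eval_toy and the toy objective are illustrative assumptions, not part of the experiment); it maximises the negative sum of squares, so individuals evolve toward the zero vector:

python
import random
from deap import base, creator, tools, algorithms

# Maximise fitness: higher (less negative) values are better
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_float", random.uniform, -1.0, 1.0)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr_float, 5)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

def eval_toy(individual):
    # Toy fitness: negative sum of squares, maximised at the zero vector
    return -sum(x * x for x in individual),

toolbox.register("evaluate", eval_toy)
toolbox.register("mate", tools.cxBlend, alpha=0.5)
toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=0.2, indpb=0.2)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=20)
pop, _ = algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2, ngen=15, verbose=False)
print(tools.selBest(pop, k=1)[0])  # best individual should be close to [0, 0, 0, 0, 0]

The same operators (blend crossover, Gaussian mutation, tournament selection) reappear in the full experiment below, where the fitness of an individual is the discriminator's score for the image generated from it.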

Procedure:

Step 1: Setup and Installation

1. Install Anaconda:
o Follow the same installation steps as provided in EXPERIMENT – 1.1.

2. Install Required Libraries:
o Open Anaconda Navigator.
o Ensure that tensorflow, keras, numpy, matplotlib, and deap (Distributed Evolutionary Algorithms in Python) are installed.

sh
conda install tensorflow keras numpy matplotlib
pip install deap
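
Optionally, verify the installation from a new notebook cell or Python prompt before continuing; this check simply imports each package and prints its version:

python
import tensorflow, numpy, matplotlib, deap
print(tensorflow.__version__, numpy.__version__, matplotlib.__version__, deap.__version__)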

Step 2: Implementing GANs using Genetic Algorithms

1. Import Necessary Libraries:

python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense, Reshape, Flatten, Dropout, BatchNormalization, LeakyReLU
from tensorflow.keras.models import Sequential
from deap import base, creator, tools, algorithms
import random
import matplotlib.pyplot as plt

2. Define GAN Components:

python
def build_generator(latent_dim):
    model = Sequential([
        Dense(256, input_dim=latent_dim),
        LeakyReLU(alpha=0.2),
        BatchNormalization(momentum=0.8),
        Dense(512),
        LeakyReLU(alpha=0.2),
        BatchNormalization(momentum=0.8),
        Dense(1024),
        LeakyReLU(alpha=0.2),
        BatchNormalization(momentum=0.8),
        Dense(28 * 28 * 1, activation='tanh'),
        Reshape((28, 28, 1))
    ])
    return model

def build_discriminator(img_shape):
    model = Sequential([
        Flatten(input_shape=img_shape),
        Dense(512),
        LeakyReLU(alpha=0.2),
        Dropout(0.4),
        Dense(256),
        LeakyReLU(alpha=0.2),
        Dropout(0.4),
        Dense(1, activation='sigmoid')
    ])
    return model
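
Optionally, the two builders can be instantiated on their own and inspected with Keras's summary() to confirm the layer stacks (these throwaway instances are not reused in later steps):

python
build_generator(100).summary()
build_discriminator((28, 28, 1)).summary()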

3. Load and Preprocess the Dataset:

python
(x_train, _), (_, _) = tf.keras.datasets.mnist.load_data()
x_train = (x_train.astype(np.float32) - 127.5) / 127.5  # Normalize to [-1, 1]
x_train = np.expand_dims(x_train, axis=3)
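
A quick optional sanity check confirms the expected array shape and value range after preprocessing:

python
print(x_train.shape)                 # (60000, 28, 28, 1)
print(x_train.min(), x_train.max())  # -1.0 1.0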

4. Build the GAN Model:

python
latent_dim = 100
img_shape = (28, 28, 1)

generator = build_generator(latent_dim)
discriminator = build_discriminator(img_shape)
discriminator.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Freeze the discriminator inside the combined model so that only the generator is updated
z = tf.keras.Input(shape=(latent_dim,))
img = generator(z)
discriminator.trainable = False
validity = discriminator(img)

combined = tf.keras.Model(z, validity)
combined.compile(loss='binary_crossentropy', optimizer='adam')

5. Define Genetic Algorithm Functions:

python
def evolve_population(population, toolbox, ngen=10, cxpb=0.5, mutpb=0.2):
    # Apply crossover and mutation, re-evaluate, then select the next generation
    for gen in range(ngen):
        offspring = algorithms.varAnd(population, toolbox, cxpb=cxpb, mutpb=mutpb)
        fits = toolbox.map(toolbox.evaluate, offspring)
        for fit, ind in zip(fits, offspring):
            ind.fitness.values = fit
        population = toolbox.select(offspring, k=len(population))
    return population

def train_gan(gan, generator, discriminator, data, epochs, batch_size,
              population_size=10, generations=10):
    latent_dim = generator.input_shape[1]
    img_shape = discriminator.input_shape[1:]

    # Maximise the discriminator's "real" score so that evolved latent vectors
    # produce images the discriminator finds convincing
    creator.create("FitnessMax", base.Fitness, weights=(1.0,))
    creator.create("Individual", list, fitness=creator.FitnessMax)

    toolbox = base.Toolbox()
    toolbox.register("attr_float", random.uniform, -1.0, 1.0)
    toolbox.register("individual", tools.initRepeat, creator.Individual,
                     toolbox.attr_float, latent_dim)
    toolbox.register("population", tools.initRepeat, list, toolbox.individual)

    def evaluate(individual):
        # Fitness of a latent vector = discriminator's score for the image it generates
        noise = np.array(individual).reshape(1, latent_dim)
        generated_image = generator.predict(noise, verbose=0)
        score = discriminator.predict(generated_image, verbose=0)[0][0]
        return float(score),

    toolbox.register("evaluate", evaluate)
    toolbox.register("mate", tools.cxBlend, alpha=0.5)
    toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=0.2, indpb=0.2)
    toolbox.register("select", tools.selTournament, tournsize=3)

    population = toolbox.population(n=population_size)

    for epoch in range(epochs):
        # Train the discriminator on a batch of real and a batch of generated images
        idx = np.random.randint(0, data.shape[0], batch_size)
        real_images = data[idx]

        noise = np.random.normal(0, 1, (batch_size, latent_dim))
        fake_images = generator.predict(noise, verbose=0)

        real_labels = np.ones((batch_size, 1))
        fake_labels = np.zeros((batch_size, 1))

        d_loss_real = discriminator.train_on_batch(real_images, real_labels)
        d_loss_fake = discriminator.train_on_batch(fake_images, fake_labels)
        d_loss = 0.5 * np.add(d_loss_real, d_loss_fake)

        # Train the generator (via the combined model) to have its images labelled as real
        noise = np.random.normal(0, 1, (batch_size, latent_dim))
        valid_y = np.ones((batch_size, 1))
        g_loss = gan.train_on_batch(noise, valid_y)

        if epoch % 100 == 0:
            print(f"Epoch {epoch} [D loss: {d_loss[0]} | D accuracy: {100 * d_loss[1]}] [G loss: {g_loss}]")

        # Evolve the population of latent vectors against the current discriminator
        population = evolve_population(population, toolbox, ngen=generations)

    return population

6. Train the GAN:

python
epochs = 10000
batch_size = 64
population_size = 20
generations = 5

trained_population = train_gan(combined, generator, discriminator, x_train,
                               epochs, batch_size, population_size, generations)

7. Visualize the Results:

python
def plot_generated_images(generator, examples=10, dim=(1, 10), figsize=(10, 1)):
    noise = np.random.normal(0, 1, (examples, generator.input_shape[1]))
    generated_images = generator.predict(noise)

    plt.figure(figsize=figsize)
    for i in range(examples):
        plt.subplot(dim[0], dim[1], i + 1)
        plt.imshow(generated_images[i].reshape(28, 28), interpolation='nearest', cmap='gray')
        plt.axis('off')
    plt.tight_layout()
    plt.show()

plot_generated_images(generator)

Step 3: Running the Program

1. Open Jupyter Notebook from Anaconda Navigator.
2. Create a new Python 3 notebook.
3. Copy and paste the above code sections into the notebook cells.
4. Execute each cell sequentially to build, train, and visualize the GAN using genetic algorithms.

Video Tutorial

GANs with Genetic Algorithms

Further Reading
Rolon-Mérette, D., Ross, M., Rolon-Mérette, T., & Church, K. (2020). Introduction to
Anaconda and Python: Installation and setup. The Quantitative Methods for Psychology,
16(5), S3–S11.

Prospective Viva Questions

1. Explain Generative Adversarial Networks and their primary use.
2. Describe the roles of the generator and discriminator in GANs.
3. Discuss how genetic algorithms can be used to optimize GANs.
4. Explain the concept of the fitness function in the context of genetic algorithms.
5. Provide examples of real-world applications where GANs can be effectively used.
