[continuation of #186] Fully convolutional networks for 2D segmentation #200

Closed · wants to merge 95 commits

Changes from all commits (95 commits)
212a8cb
first commit
StephanieLarocque May 1, 2017
46f6a35
images for fcn and unet
StephanieLarocque May 1, 2017
3cc70cd
small changes
StephanieLarocque May 1, 2017
6fed95d
fixed details
adri-romsor May 1, 2017
ae93f3a
polyps dataset explanation + metrics
StephanieLarocque May 1, 2017
ae295bd
jaccard visualisation
StephanieLarocque May 1, 2017
1433d2e
code for fcn8, miss dataset_loaders
StephanieLarocque May 1, 2017
de2d672
small changes
StephanieLarocque May 1, 2017
5a2bf99
small changes
StephanieLarocque May 1, 2017
04a030d
old build instructions
StephanieLarocque May 1, 2017
f17ad4a
fixed fcn8
adri-romsor May 1, 2017
c987621
cortical
StephanieLarocque May 1, 2017
43207d7
ray
StephanieLarocque May 1, 2017
1766b4e
remove file
StephanieLarocque May 1, 2017
022d3c0
remove dependance from metrics.py
StephanieLarocque May 1, 2017
a430525
remove dependance from model_helpers.py
StephanieLarocque May 1, 2017
350cff2
big brain images
StephanieLarocque May 1, 2017
e6e82c7
added reference captions
StephanieLarocque May 1, 2017
fe0ea46
global dataset loader from fvsin
StephanieLarocque May 1, 2017
604025b
first commit for cortical layers segmentation
StephanieLarocque May 1, 2017
fa32fcb
first commit
StephanieLarocque May 1, 2017
f76c5f8
cortical layers
May 2, 2017
f986af8
cortical layers updated
StephanieLarocque May 2, 2017
1e29a68
cortical layers imagse
StephanieLarocque May 2, 2017
694c612
Feedback comments from PascalLamblin
StephanieLarocque May 2, 2017
eeedbd9
fixing small details on the metrics
adri-romsor May 2, 2017
138ac61
dataset explanation addeed
StephanieLarocque May 2, 2017
0d2e40f
small changes
StephanieLarocque May 2, 2017
1e86c47
reviewed 1d segm
adri-romsor May 2, 2017
f7267b9
Delete fcn_1D_segm.txt
StephanieLarocque May 2, 2017
37165c9
fix for FCN32-16-8 description
StephanieLarocque May 2, 2017
4fa0e2c
FCN description fixed
StephanieLarocque May 2, 2017
548d011
typo
StephanieLarocque May 2, 2017
0c02bde
fix figure labelling
StephanieLarocque May 2, 2017
0cb25a5
explanation ground truth vs predicted segmentation
StephanieLarocque May 2, 2017
4aef44e
figure 4 description
StephanieLarocque May 2, 2017
1ec060c
deleted files
StephanieLarocque May 2, 2017
635c16c
first commit for fcn1D segmentations
StephanieLarocque May 2, 2017
deec646
edited text, including 2D-> 3D changes
May 3, 2017
dc9f370
lasagne recipe unet implementation
May 3, 2017
18a5dd1
unet update
May 3, 2017
c8d3ed4
added code
May 3, 2017
62bf8b0
changes in the prerequisite
May 3, 2017
22258be
small corrections
adri-romsor May 3, 2017
43ed78e
references fixed
May 3, 2017
e6055ce
acknowledgements
May 3, 2017
4f37df3
fix website links
May 3, 2017
09bbbd2
polyps results image
May 3, 2017
683a233
no change
May 3, 2017
e364e6f
update from pascal comment
May 3, 2017
b218cfb
small changes
May 4, 2017
d515c79
delete unet.py (previous version)
StephanieLarocque May 4, 2017
fc5858d
small change
StephanieLarocque May 4, 2017
c409e17
cleaned dataset loader
StephanieLarocque May 4, 2017
a38467b
relative paths
StephanieLarocque May 4, 2017
88f5441
cleaned dataset loader
StephanieLarocque May 4, 2017
873e9d4
deleted file
StephanieLarocque May 4, 2017
3ee4686
relative paths
StephanieLarocque May 4, 2017
2f92d33
with BN
StephanieLarocque May 4, 2017
762da4b
small changes
StephanieLarocque May 4, 2017
2026c59
Rename code/Unet_lasagne_recipes.py to code/unet/Unet_lasagne_recipes.py
StephanieLarocque May 4, 2017
8285a9f
first commit
StephanieLarocque May 4, 2017
d63ae5f
dataset loader for unet and fcn8
StephanieLarocque May 4, 2017
ce89838
fixed path for unet.py
StephanieLarocque May 4, 2017
cac2d01
adde training loop link
StephanieLarocque May 4, 2017
c07ef34
fix dataset loader for em, name for polyps
StephanieLarocque May 4, 2017
a6e6bf1
fix import
StephanieLarocque May 4, 2017
0558598
small change
StephanieLarocque May 4, 2017
70495d8
small change
StephanieLarocque May 4, 2017
3f8dd64
fix input dim and details
StephanieLarocque May 4, 2017
e3a4ebf
import conv2DDDNlayer
StephanieLarocque May 4, 2017
44ff025
saving stuff
StephanieLarocque May 4, 2017
bbdf0e5
small changes
StephanieLarocque May 4, 2017
c1a2b19
accuracy import fix
StephanieLarocque May 4, 2017
1adcc24
small change
StephanieLarocque May 4, 2017
19068d4
load data from train file
StephanieLarocque May 4, 2017
f1972a4
data link
StephanieLarocque May 4, 2017
db56146
removed files .pyc
May 5, 2017
d731882
index updated with new links
May 5, 2017
ca52c8a
ref for dataset loaders
May 5, 2017
119d10e
remove loading pretrained weights
May 5, 2017
5b05fab
update in loading data parameters
May 5, 2017
6f867c6
added requirements for lasagne, datasetloaders and simple ITK
May 6, 2017
70c213a
specify that data augmentation is used
May 6, 2017
15efc40
data augmentation fixed
May 6, 2017
739870e
fix in shared_path for config.ini
May 6, 2017
ac54712
More small fixes
lamblin May 9, 2017
ef57f6b
Fix warnings in doc generation
lamblin May 9, 2017
5a62c66
Remove files that are not used any more
lamblin May 10, 2017
2a9af6f
Formatting
lamblin May 10, 2017
e0d53ba
Changes to help unet run
lamblin May 23, 2017
28fa556
added per class jaccard
adri-romsor May 23, 2017
86d6e16
Results for unet
lamblin May 24, 2017
40819f9
Preload data for 1D cortical dataset
lamblin May 25, 2017
72881fb
Update segmentation tutorials and .gitignore.
notoraptor Dec 14, 2017
3 changes: 3 additions & 0 deletions .gitignore
@@ -1,3 +1,4 @@
.idea
code/*.pyc
code/*_plots
code/tmp*
@@ -13,3 +14,5 @@ html
*.pyc
*~
*.swp
# This directory may be created by scripts from segmentation tutorials.
save_models
2 changes: 1 addition & 1 deletion README.rst
@@ -37,4 +37,4 @@ Subdirectories:
Build instructions
------------------

To build the html version of the tutorials, install sphinx and run doc/Makefile
To build the html version of the tutorials, run python doc/scripts/docgen.py
Empty file.
185 changes: 185 additions & 0 deletions code/cnn_1D_segm/data_loader/cortical_layers.py
@@ -0,0 +1,185 @@
import os
import time

import numpy as np
from PIL import Image
import re
import warnings

from dataset_loaders.parallel_loader import ThreadedDataset
from parallel_loader_1D import ThreadedDataset_1D

floatX = 'float32'

class Cortical6LayersDataset(ThreadedDataset_1D):
'''The Cortical Layers Dataset.
Parameters
----------
which_set: string
A string in ['train', 'val', 'valid', 'test'], corresponding to
the set to be returned.
split: float
A float indicating the dataset split between training and validation.
    For example, if split=0.85, 85% of the images will be used for training,
    whereas 15% will be used for validation.
'''
name = 'cortical_layers'

non_void_nclasses = 7
GTclasses = [0, 1, 2, 3, 4, 5, 6]
_cmap = {
0: (128, 128, 128), # padding
1: (128, 0, 0), # layer 1
        2: (128, 64, 0),    # layer 2 (blue value truncated in the source; 0 assumed)
3: (128, 64, 128), # layer 3
4: (0, 0, 128), # layer 4
5: (0, 0, 64), # layer 5
6: (64, 64, 128), # layer 6
}
    _mask_labels = {0: 'padding', 1: 'layer1', 2: 'layer2', 3: 'layer3',
                    4: 'layer4', 5: 'layer5', 6: 'layer6'}
_void_labels = []


_filenames = None

@property
def filenames(self):

        if self._filenames is None:
            # Count the lines of the mask file to get the number of samples.
            with open(self.mask_path) as f:
                nfiles = sum(1 for line in f)
            filenames = list(range(nfiles))
            np.random.seed(1609)
            np.random.shuffle(filenames)

            if self.which_set == 'train':
                filenames = filenames[:int(nfiles*self.split)]
            elif self.which_set == 'val':
                filenames = filenames[-(nfiles - int(nfiles*self.split)):]

            # Save the filenames list
            self._filenames = filenames

        return self._filenames

    def __init__(self,
                 which_set="train",
                 split=0.85,
                 shuffle_at_each_epoch=True,
                 smooth_or_raw='both',
                 *args, **kwargs):

self.task = 'segmentation'

self.n_layers = 6
n_layers_path = str(self.n_layers)+"layers_segmentation"

self.which_set = "val" if which_set == "valid" else which_set
if self.which_set not in ("train", "val", 'test'):
raise ValueError("Unknown argument to which_set %s" %
self.which_set)

self.split = split

self.image_path_raw = os.path.join(self.path,n_layers_path,"training_raw.txt")
self.image_path_smooth = os.path.join(self.path,n_layers_path, "training_geo.txt")
self.mask_path = os.path.join(self.path,n_layers_path, "training_cls.txt")
self.regions_path = os.path.join(self.path, n_layers_path, "training_regions.txt")

self.smooth_raw_both = smooth_or_raw

        if smooth_or_raw == 'both':
            self.data_shape = (200, 2)
        else:
            self.data_shape = (200, 1)

super(Cortical6LayersDataset, self).__init__(*args, **kwargs)

def get_names(self):
"""Return a dict of names, per prefix/subset."""

return {'default': self.filenames}



def test_6layers():
train_iter = Cortical6LayersDataset(
which_set='train',
smooth_or_raw = 'both',
batch_size=500,
data_augm_kwargs={},
return_one_hot=False,
return_01c=False,
return_list=True,
use_threads=False)

valid_iter = Cortical6LayersDataset(
which_set='valid',
smooth_or_raw = 'smooth',
batch_size=500,
data_augm_kwargs={},
return_one_hot=False,
return_01c=False,
return_list=True,
use_threads=False)

valid_iter2 = Cortical6LayersDataset(
which_set='valid',
smooth_or_raw = 'raw',
batch_size=500,
data_augm_kwargs={},
return_one_hot=False,
return_01c=False,
return_list=True,
use_threads=False)



train_nsamples = train_iter.nsamples
train_nbatches = train_iter.nbatches
valid_nbatches = valid_iter.nbatches
valid_nbatches2 = valid_iter2.nbatches



    # Simulate training
    max_epochs = 1
    print("Simulate training for {} epochs".format(max_epochs))
    start_training = time.time()
    for epoch in range(max_epochs):
        print("Epoch #{}".format(epoch))

        start_epoch = time.time()

        print("Iterate on the training set: {} minibatches".format(train_nbatches))
        for mb in range(train_nbatches):
            start_batch = time.time()
            batch = next(train_iter)
            if mb % 5 == 0:
                print("Minibatch train {}: {} sec".format(mb, time.time() - start_batch))

        print("Iterate on the validation set: {} minibatches".format(valid_nbatches))
        for mb in range(valid_nbatches):
            start_batch = time.time()
            batch = next(valid_iter)
            if mb % 5 == 0:
                print("Minibatch valid {}: {} sec".format(mb, time.time() - start_batch))

        print("Iterate on the validation set (second time): {} minibatches".format(valid_nbatches2))
        for mb in range(valid_nbatches2):
            start_batch = time.time()
            batch = next(valid_iter2)
            if mb % 5 == 0:
                print("Minibatch valid {}: {} sec".format(mb, time.time() - start_batch))

        print("Epoch time: %s" % str(time.time() - start_epoch))
    print("Training time: %s" % str(time.time() - start_training))

if __name__ == '__main__':
    print("Loading the dataset 1 batch at a time")
    test_6layers()
    print("Success!")
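The seeded shuffle-then-slice split used in the `filenames` property above can be reproduced in isolation. This is a minimal sketch, assuming only NumPy; the `split_indices` helper and its argument names are illustrative and not part of the PR:

```python
import numpy as np

def split_indices(nfiles, split=0.85, seed=1609):
    """Shuffle sample indices with a fixed seed, then slice train/val.

    Mirrors the loader's logic: the first int(nfiles * split) shuffled
    indices go to training, the remaining ones to validation.
    """
    indices = list(range(nfiles))
    rng = np.random.RandomState(seed)
    rng.shuffle(indices)                # in-place, deterministic shuffle
    n_train = int(nfiles * split)
    return indices[:n_train], indices[-(nfiles - n_train):]

train_idx, val_idx = split_indices(100)
```

Because the seed is fixed, the train/validation partition is identical across runs, so the same samples never move between the two sets from one experiment to the next.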
Loading
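The class palette in `_cmap` is only declared by the loader; applying it to a predicted label map reduces to a NumPy fancy-indexing lookup. A minimal sketch, where the `colorize` helper is hypothetical and the missing blue value for class 2 is assumed to be 0:

```python
import numpy as np

# Palette as in Cortical6LayersDataset._cmap (class index -> RGB).
CMAP = {
    0: (128, 128, 128),  # padding
    1: (128, 0, 0),      # layer 1
    2: (128, 64, 0),     # layer 2 (blue value assumed)
    3: (128, 64, 128),   # layer 3
    4: (0, 0, 128),      # layer 4
    5: (0, 0, 64),       # layer 5
    6: (64, 64, 128),    # layer 6
}

def colorize(labels, cmap=CMAP):
    """Turn an integer label map (H, W) into an RGB image (H, W, 3)."""
    palette = np.zeros((max(cmap) + 1, 3), dtype=np.uint8)
    for cls, rgb in cmap.items():
        palette[cls] = rgb
    # Fancy indexing broadcasts the lookup over every pixel at once.
    return palette[labels]

rgb = colorize(np.array([[0, 1], [6, 3]]))
```

The same lookup works for any label-map shape, so it can be used to render both ground-truth masks and network predictions for side-by-side comparison.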