Compare revisions (decentralizepy)

Changes are shown as if the source revision were being merged into the target revision. This comparison contains 1184 additions and 25 deletions; all diffs except the Celeba configuration hunk below are collapsed.
@@ -2,24 +2,23 @@
 dataset_package = decentralizepy.datasets.Celeba
 dataset_class = Celeba
 model_class = CNN
-n_procs = 96
-images_dir = /home/risharma/leaf/data/celeba/data/raw/img_align_celeba
-train_dir = /home/risharma/leaf/data/celeba/per_user_data/train
-test_dir = /home/risharma/leaf/data/celeba/data/test
+images_dir = /mnt/nfs/shared/leaf/data/celeba/data/raw/img_align_celeba
+train_dir = /mnt/nfs/shared/leaf/data/celeba/per_user_data/train
+test_dir = /mnt/nfs/shared/leaf/data/celeba/data/test
 ; python list of fractions below
 sizes =
 
 [OPTIMIZER_PARAMS]
 optimizer_package = torch.optim
-optimizer_class = Adam
+optimizer_class = SGD
 lr = 0.001
 
 [TRAIN_PARAMS]
-training_package = decentralizepy.training.GradientAccumulator
-training_class = GradientAccumulator
-rounds = 20
+training_package = decentralizepy.training.Training
+training_class = Training
+rounds = 4
 full_epochs = False
-batch_size = 64
+batch_size = 16
 shuffle = True
 loss_package = torch.nn
 loss_class = CrossEntropyLoss
...
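The paired `optimizer_package` / `optimizer_class` keys (and the matching `training_package` / `training_class` keys) point at a dynamic-import pattern: the config names a module and a class inside it, and remaining keys become constructor arguments. As a rough illustration only, here is a minimal sketch, assuming Python's standard `configparser` and `importlib`, of how the `[OPTIMIZER_PARAMS]` section could be resolved into a `torch.optim.SGD` instance. The `load_optimizer` helper and the `config_celeba.ini` filename are hypothetical and do not reflect decentralizepy's actual API.

    # Hypothetical sketch, not decentralizepy's real loader.
    import configparser
    import importlib

    def load_optimizer(config_path, model_parameters):
        config = configparser.ConfigParser()
        config.read(config_path)
        section = config["OPTIMIZER_PARAMS"]

        # e.g. optimizer_package = torch.optim, optimizer_class = SGD
        module = importlib.import_module(section["optimizer_package"])
        optimizer_cls = getattr(module, section["optimizer_class"])

        # Pass the remaining keys (here just lr) through as keyword
        # arguments; assumes they are all numeric, which holds for lr.
        kwargs = {
            key: float(value)
            for key, value in section.items()
            if key not in ("optimizer_package", "optimizer_class")
        }
        return optimizer_cls(model_parameters, **kwargs)

With the new revision of this file, `load_optimizer("config_celeba.ini", model.parameters())` would yield the equivalent of `torch.optim.SGD(model.parameters(), lr=0.001)`.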