Projects

Smooth Adversarial Training
The ReLU activation function significantly weakens adversarial training due to its non-smooth nature, so we propose smooth adversarial training (SAT), which swaps ReLU for smooth activations (a sketch follows below).
adversarial-training adversarial-learning relu sat
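
A minimal sketch of the idea, assuming a toy MLP and a one-step FGSM attack (neither is the paper's setup): keep the adversarial-training loop unchanged and only swap ReLU for a smooth activation such as SiLU so the inner maximization gets better gradients.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_mlp(activation: nn.Module) -> nn.Sequential:
    # Illustrative architecture; the paper uses larger networks.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(784, 256), activation,
        nn.Linear(256, 10),
    )

relu_net   = make_mlp(nn.ReLU())   # standard adversarial training
smooth_net = make_mlp(nn.SiLU())   # smooth adversarial training (SAT)

def fgsm_adv_example(model, x, y, eps=0.1):
    """One-step FGSM attack used inside the training loop (illustrative attack choice)."""
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad, = torch.autograd.grad(loss, x)
    return (x + eps * grad.sign()).detach()

def adv_train_step(model, optimizer, x, y):
    # Train on adversarial examples; the only change for SAT is the activation above.
    x_adv = fgsm_adv_example(model, x, y)
    loss = F.cross_entropy(model(x_adv), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```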
EvoNorms: Evolving Normalization-Activation Layers
We use evolution to design new layers called EvoNorms, which outperform BatchNorm-ReLU on many tasks (a sketch of one variant follows below).
automl normalization activations activation-layers
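
A rough sketch of the EvoNorm-S0 variant as I recall it from the paper: x * sigmoid(v * x) divided by a grouped standard deviation, followed by an affine transform. The group count, eps, and 4D input shape here are assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class EvoNormS0(nn.Module):
    def __init__(self, channels: int, groups: int = 8, eps: float = 1e-5):
        super().__init__()
        # channels is assumed to be divisible by groups in this sketch.
        self.groups, self.eps = groups, eps
        self.gamma = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, channels, 1, 1))

    def group_std(self, x: torch.Tensor) -> torch.Tensor:
        # Standard deviation computed per group of channels (and spatial dims).
        n, c, h, w = x.shape
        x_g = x.view(n, self.groups, c // self.groups, h, w)
        std = torch.sqrt(x_g.var(dim=(2, 3, 4), keepdim=True) + self.eps)
        return std.expand_as(x_g).reshape(n, c, h, w)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        num = x * torch.sigmoid(self.v * x)
        return num / self.group_std(x) * self.gamma + self.beta
```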
Implicit Neural Representations with Periodic Activation Functions
Leverages periodic activation functions for implicit neural representations and demonstrates that these networks, dubbed sinusoidal representation networks (SIRENs), are well suited for representing complex signals and their derivatives (a layer sketch follows below).
siren activation-functions tanh relu
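
A minimal sketch of a SIREN layer: a linear map followed by sin(w0 * ·), with the uniform initialization described in the paper (first layer roughly U(-1/fan_in, 1/fan_in), later layers U(-sqrt(6/fan_in)/w0, sqrt(6/fan_in)/w0), w0 = 30). Treat the exact constants here as recalled assumptions rather than the reference implementation.

```python
import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    def __init__(self, in_features, out_features, w0=30.0, is_first=False):
        super().__init__()
        self.w0 = w0
        self.linear = nn.Linear(in_features, out_features)
        with torch.no_grad():
            # Weight init bounds as described above; bias keeps the default init.
            if is_first:
                bound = 1.0 / in_features
            else:
                bound = math.sqrt(6.0 / in_features) / w0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

# Example use: mapping 2D coordinates to a scalar signal (e.g. pixel intensity).
siren = nn.Sequential(
    SineLayer(2, 256, is_first=True),
    SineLayer(256, 256),
    nn.Linear(256, 1),
)
```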
Separating Sources of Randomness in NN at Initialization Time
Neurons can have very different distributions when they are computed over weight randomness compared to when they are computed over sample randomness; the sketch below separates the two estimates for a single neuron.
weights-initialization randomness neural-networks kaiming-initialization
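
An illustrative sketch of how one might estimate the two distributions separately for a single linear neuron at initialization: once over redrawn Kaiming-style weights with a fixed input, and once over redrawn inputs with fixed weights. The toy Gaussian data and layer width are assumptions, not the project's experimental setup.

```python
import torch

torch.manual_seed(0)
fan_in, n_draws = 512, 10_000

# Randomness over weights: fix one input, redraw Kaiming-style weight vectors.
x_fixed = torch.randn(fan_in)
w_draws = torch.randn(n_draws, fan_in) * (2.0 / fan_in) ** 0.5
over_weights = w_draws @ x_fixed            # pre-activations, shape (n_draws,)

# Randomness over samples: fix one weight vector, redraw inputs.
w_fixed = torch.randn(fan_in) * (2.0 / fan_in) ** 0.5
x_draws = torch.randn(n_draws, fan_in)
over_samples = x_draws @ w_fixed            # pre-activations, shape (n_draws,)

print(f"over weights:  mean {over_weights.mean().item():.3f}  std {over_weights.std().item():.3f}")
print(f"over samples:  mean {over_samples.mean().item():.3f}  std {over_samples.std().item():.3f}")
```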