WW-PGD: Projected Gradient Descent optimizer
Announcing WW-PGD: WeightWatcher Projected Gradient Descent

I just released WW-PGD, a small PyTorch add-on that wraps standard optimizers (SGD, Adam, AdamW, etc.) and applies an epoch-boundary spectral projection using WeightWatcher diagnostics.

Elevator pitch: WW-PGD explicitly nudges each layer toward the Exact Renormalization Group (ERG) critical manifold during training.

Theory in short
• HTSR critical condition: α ≈ 2
• SETOL ERG condition: trace-log(λ) over the spectral tail = 0

WW-PGD makes these explicit optimization targets, rather than […]
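Written out, the two bullets above amount to per-layer targets on the eigenvalues λ_i of a layer's correlation matrix, with the "tail" meaning the eigenvalues covered by the fitted power law. The notation below is my reading of the bullets, not a formula quoted from the HTSR/SETOL papers:

```latex
% HTSR critical condition: power-law exponent fitted to the tail of the ESD
\alpha \approx 2
% SETOL ERG condition: the trace-log (log-determinant) over the spectral tail vanishes
\operatorname{Tr}\,\log \Lambda_{\mathrm{tail}}
  \;=\; \sum_{\lambda_i \in \mathrm{tail}} \log \lambda_i \;=\; 0
```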
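The post doesn't show the package's API, so here is a minimal sketch of the mechanism under my own assumptions: a plain PyTorch loop with AdamW, plus an epoch-boundary step that runs WeightWatcher and applies a placeholder spectral projection to layers whose fitted alpha strays from 2. The names `project_at_epoch_boundary` and `spectral_projection`, and the shrink-the-top-singular-values rule, are illustrative stand-ins, not the actual WW-PGD projection.

```python
# Minimal sketch of the idea, NOT the released WW-PGD API.
# Assumptions (mine): an all-nn.Linear model, weightwatcher installed,
# and a placeholder projection that damps the largest singular values.
import torch
import torch.nn as nn
import weightwatcher as ww


def spectral_projection(weight: torch.Tensor, shrink: float = 0.9) -> torch.Tensor:
    """Illustrative projection: damp the top of the singular spectrum so the
    fitted power-law exponent drifts back toward alpha ~ 2. The real WW-PGD
    projection is presumably more principled."""
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    k = max(1, int(0.05 * S.numel()))   # touch only the heaviest ~5% of singular values
    S[:k] *= shrink
    return U @ torch.diag(S) @ Vh


def project_at_epoch_boundary(model: nn.Module, alpha_target: float = 2.0, tol: float = 0.5) -> None:
    """Run WeightWatcher diagnostics, then project layers whose alpha strays from the target."""
    details = ww.WeightWatcher(model=model).analyze()          # per-layer DataFrame with an 'alpha' column
    linear_layers = [m for m in model.modules() if isinstance(m, nn.Linear)]
    for layer, alpha in zip(linear_layers, details["alpha"]):  # assumes rows line up with the Linear layers
        if abs(float(alpha) - alpha_target) > tol:
            with torch.no_grad():
                layer.weight.copy_(spectral_projection(layer.weight))


# Usage: an ordinary optimizer does the per-step work; the projection runs once per epoch.
model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 10))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for _ in range(100):                                       # stand-in for a real DataLoader
        x, y = torch.randn(32, 256), torch.randint(0, 10, (32,))
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()
    project_at_epoch_boundary(model)                           # epoch-boundary spectral projection
```

Keeping the projection at epoch boundaries rather than every step keeps the SVDs and power-law fits off the hot path, which matches the "epoch-boundary spectral projection" framing in the announcement.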