Large-scale network design problems, such as continental fiber-optic routing or global logistics, are computationally intractable for exact solvers and require spatial decomposition to scale. However, solving localized sub-problems independently often leads to severe violations of global constraints, such as boundary connectivity and total cost budgets. While constrained deep learning offers a mechanism to enforce these global requirements via penalty terms, standard min-max optimization frequently oscillates at the sub-problem boundaries, preventing convergence to a feasible global topology.
Approach
We propose a self-supervised primal-dual learning architecture that integrates spatial graph partitioning with control-theoretic multiplier updates. Following the spatial grid decomposition strategy of [DualOpt: A Dual Divide-and-Optimize Algorithm for the Large-scale Traveling Salesman Problem](/paper/art_b6ba8c3d1e354258b4901e1bfa009111), a primal neural network predicts local routing decisions within independent sub-grids. To enforce global connectivity and boundary matching, we formulate the stitching requirements as differentiable constraints. Instead of static penalties, we employ the proportional-integral (PI) controller framework from [On PI Controllers for Updating Lagrange Multipliers in Constrained Optimization](/paper/art_0854456fc63f46ee9662a5a3c845fa21) to dynamically update instance-specific Lagrange multipliers, applying an exponential moving average to boundary-violation signals to damp the oscillations typically seen when merging local sub-routes.
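As a minimal sketch of the multiplier update described above (not the authors' implementation): plain dual ascent corresponds to the integral term alone, while the PI controller adds a proportional term on an EMA-smoothed violation signal. The gain names `kappa_p`, `kappa_i` and the smoothing coefficient `beta` are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

def pi_multiplier_update(lmbda, integral, ema, violation,
                         kappa_p=0.1, kappa_i=0.01, beta=0.9):
    """One PI update for a vector of Lagrange multipliers.

    violation: g(x) for inequality constraints g(x) <= 0, so positive
    entries indicate infeasible boundary stitching.
    """
    # EMA-smooth the violation signal to damp boundary oscillations.
    ema = beta * ema + (1.0 - beta) * violation
    # Integral term: accumulated smoothed violation (plain dual ascent).
    integral = integral + ema
    # Proportional + integral terms, projected onto the nonnegative orthant.
    lmbda = np.maximum(0.0, kappa_p * ema + kappa_i * integral)
    return lmbda, integral, ema
```

With a persistently violated constraint the multiplier grows, increasing the penalty pressure on that boundary; for a satisfied constraint (negative violation) the projection keeps the multiplier at zero.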
Experimental Plan
We will evaluate our method on large-scale instances of the Traveling Salesman Problem (TSP-10000) and the minimum Steiner tree problem using real-world geographic datasets such as the National Science Foundation Network (NSFNet) and OpenStreetMap road networks. The primary hypothesis is that PI-controlled Lagrangian updates will achieve a lower optimality gap and faster convergence to feasible global routes than standard gradient ascent on the dual variables. Baselines will include traditional exact solvers (Gurobi, with a time limit), standard Augmented Lagrangian Methods as used in [Self-Supervised Primal-Dual Learning for Constrained Optimization](/paper/art_42df3475397240e680d67542f4bcbd3b), and heuristic boundary-stitching methods. Metrics will include the global objective cost, the percentage of boundary constraint violations prior to post-processing, and total inference time.
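The two feasibility-oriented metrics can be made concrete with a small sketch. This is an illustrative assumption about how the metrics would be computed, not code from the proposal: the gap is measured against a reference solver's cost, and a boundary violation is counted whenever two adjacent sub-grids propose different endpoints for a shared crossing.

```python
def optimality_gap(cost, reference_cost):
    """Relative gap (%) against a reference cost, e.g. Gurobi or best-known."""
    return 100.0 * (cost - reference_cost) / reference_cost

def boundary_violation_rate(pairs):
    """Fraction of boundary crossings whose sub-route endpoints disagree.

    pairs: iterable of (node_left, node_right) endpoints that the two
    adjacent sub-grids each propose for a shared boundary crossing,
    measured before any stitching post-processing.
    """
    pairs = list(pairs)
    mismatched = sum(1 for a, b in pairs if a != b)
    return mismatched / len(pairs) if pairs else 0.0
```

For example, a tour costing 105 against a reference of 100 has a 5% gap, and one mismatch among two boundary crossings yields a 50% violation rate.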