Optimization techniques are widely used to allocate and share limited resources among competing demands in communication and computer infrastructures. The speaker will begin by presenting the well-known Transmission Control Protocol (TCP) as a distributed solution that achieves the optimal allocation of network bandwidth. Unfortunately, factors such as multiple grades of service quality, variable transmission power, and tradeoffs between communication and computation often make the resource-allocation optimization problem non-convex, so new distributed solution techniques are needed.
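The TCP interpretation above can be sketched as price-based dual gradient ascent on a toy network utility maximization problem: each source picks its rate from the link price alone, and the link adjusts its price from local congestion. The log utility, capacity, and step size here are illustrative assumptions, not details from the talk.

```python
# Toy network utility maximization: maximize sum_i log(x_i) subject to
# a single shared link of capacity C, solved by dual (price-based)
# gradient ascent -- the mechanism TCP congestion control is often
# interpreted as implementing. All parameter values are assumptions.

C = 10.0     # link capacity (assumed)
n = 4        # number of competing sources
step = 0.05  # dual step size
price = 1.0  # initial link price (dual variable)

for _ in range(2000):
    # Each source reacts only to the price: x_i = argmax log(x) - price*x = 1/price
    rates = [1.0 / price for _ in range(n)]
    # The link raises its price when demand exceeds capacity (subgradient step)
    price = max(1e-6, price + step * (sum(rates) - C))

print([round(r, 2) for r in rates])  # each rate converges to roughly C/n = 2.5
```

Note that no source needs global knowledge: the shared price is the only coordination signal, which is what makes the allocation distributed.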
As an illustrative example, the speaker will consider in-network data processing in sensor networks, where data are aggregated (fused) as they are transferred toward the end user. Finding the optimal solution to this distributed processing problem is NP-hard in general, but in specific settings the problem admits a distributed framework that achieves the optimal tradeoff between communication and computation costs.
For the problems above, gradient-based iterative algorithms are commonly used, and much research focuses on accelerating their convergence. However, whenever the system parameters change, the iterative methods must be rerun to obtain a new solution. The speaker will present a new machine-learning method that uses two Coupled Long Short-Term Memory (CLSTM) networks to quickly and robustly produce optimal or near-optimal solutions to constrained optimization problems over a range of system parameters. Numerical examples for the allocation of network resources will be presented to confirm the validity of the proposed method.
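As a minimal sketch of the iterative baseline being discussed, the following projected-gradient loop solves a small constrained allocation problem. The quadratic objective, demand vector, and capacity are illustrative assumptions; the point is that any change to these parameters forces the whole iteration to be rerun, which is the cost a learned parameter-to-solution mapping aims to avoid.

```python
# Projected gradient ascent for a toy allocation problem:
# maximize -0.5 * ||x - d||^2 subject to sum(x) <= C, x >= 0.
# The demands d and capacity C are assumed for illustration.

def project(v, C):
    """Euclidean projection onto {x >= 0, sum(x) <= C}."""
    clipped = [max(t, 0.0) for t in v]
    if sum(clipped) <= C:
        return clipped
    # Otherwise project onto the simplex {x >= 0, sum(x) = C}
    u = sorted(v, reverse=True)
    css, theta = 0.0, 0.0
    for i, ui in enumerate(u, start=1):
        css += ui
        t = (css - C) / i
        if ui - t > 0:
            theta = t
    return [max(vi - theta, 0.0) for vi in v]

d = [3.0, 1.0, 5.0, 2.0]  # assumed demands
C = 8.0                   # assumed capacity
x = [0.0] * len(d)
step = 0.5
for _ in range(200):
    grad = [di - xi for di, xi in zip(d, x)]  # gradient of the objective
    x = project([xi + step * gi for xi, gi in zip(x, grad)], C)

print([round(xi, 2) for xi in x])  # roughly [2.25, 0.25, 4.25, 1.25]
```

Changing `d` or `C` even slightly invalidates the computed `x` and restarts the loop from scratch; a learned model instead amortizes this work by mapping parameters directly to (near-)optimal allocations.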