Advances in System Identification and Stochastic Optimization

dc.contributor.advisor: Basu, Amitabh
dc.contributor.committeeMember: Spall, James C.
dc.contributor.committeeMember: Naiman, Daniel Q.
dc.contributor.committeeMember: Lu, Fei
dc.creator: Wang, Long
dc.creator.orcid: 0000-0001-6328-8222
dc.date.accessioned: 2021-06-25T13:10:35Z
dc.date.available: 2021-06-25T13:10:35Z
dc.date.created: 2021-05
dc.date.issued: 2021-03-30
dc.date.submitted: May 2021
dc.date.updated: 2021-06-25T13:10:35Z
dc.description.abstract: This work studies the framework of systems with subsystems, which has numerous practical applications, including system reliability estimation, sensor networks, and object detection. Consider a stochastic system composed of multiple subsystems, where the subsystem outputs follow common distributions such as the Gaussian, exponential, and multinomial. In Chapter 1, we identify the parameters of the system by combining structural knowledge of the system with data collected independently from multiple sources. Using the principles of maximum likelihood estimation, we provide formal conditions for the convergence of the estimates to the true full-system and subsystem parameters. The asymptotic normality of the estimates and its connection to the Fisher information matrix are also established, which are useful for constructing asymptotic or finite-sample confidence bounds. In Chapter 2, the maximum likelihood approach is connected to general stochastic optimization via recursive least-squares estimation. For stochastic optimization, we consider minimizing a loss function using only noisy function measurements and propose two general-purpose algorithms. In Chapter 3, the mixed simultaneous perturbation stochastic approximation (MSPSA) is introduced, which is designed for mixed-variable problems (those containing both continuous and discrete variables). The proposed MSPSA fills the gap of handling mixed variables within the SPSA family and unifies the simultaneous perturbation framework, as both standard SPSA and discrete SPSA can be viewed as special cases of MSPSA. The almost sure convergence and rate of convergence of the MSPSA iterates are also derived. The convergence results reveal that the finite-sample bound of MSPSA is identical to that of discrete SPSA when the problem contains only discrete variables, and the asymptotic bound of MSPSA has the same order of magnitude as that of SPSA when the problem contains only continuous variables. In Chapter 4, the complex-step SPSA (CS-SPSA) is introduced, which uses complex-valued perturbations to improve the efficiency of standard SPSA. We prove that the CS-SPSA iterates converge almost surely to the optimum and achieve an accelerated convergence rate that is faster than the standard rate of derivative-free stochastic optimization algorithms.
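The SPSA variants described in the abstract build on the standard simultaneous perturbation gradient approximation, in which all coordinates are perturbed at once and the gradient is estimated from just two noisy loss measurements per iteration. Below is a minimal sketch of that standard scheme (Spall's basic SPSA, not the MSPSA or CS-SPSA variants developed in Chapters 3 and 4), assuming a generic noisy loss measurement `noisy_loss`; the gain-sequence constants and the quadratic test problem are illustrative choices, not taken from the dissertation.

import numpy as np

def spsa_minimize(noisy_loss, theta0, num_iter=1000, a=0.1, c=0.1,
                  A=100, alpha=0.602, gamma=0.101, rng=None):
    """Basic SPSA: estimate the gradient from two noisy loss measurements per
    iteration using a simultaneous random perturbation of all coordinates."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(num_iter):
        ak = a / (k + 1 + A) ** alpha                       # step-size gain
        ck = c / (k + 1) ** gamma                           # perturbation gain
        delta = rng.choice([-1.0, 1.0], size=theta.shape)   # Rademacher perturbation
        y_plus = noisy_loss(theta + ck * delta)
        y_minus = noisy_loss(theta - ck * delta)
        ghat = (y_plus - y_minus) / (2.0 * ck * delta)      # simultaneous perturbation gradient estimate
        theta -= ak * ghat
    return theta

if __name__ == "__main__":
    # Hypothetical noisy quadratic loss for illustration only (not from the dissertation).
    rng = np.random.default_rng(0)
    target = np.array([1.0, -2.0, 0.5])
    def noisy_loss(theta):
        return np.sum((theta - target) ** 2) + rng.normal(scale=0.01)
    estimate = spsa_minimize(noisy_loss, theta0=np.zeros(3), num_iter=5000, rng=rng)
    print("estimate:", estimate)

CS-SPSA, by contrast, replaces the two-sided real-valued measurements with complex-valued perturbations to accelerate convergence; the sketch above shows only the standard real-valued scheme.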
dc.format.mimetype: application/pdf
dc.identifier.uri: http://jhir.library.jhu.edu/handle/1774.2/64046
dc.language.iso: en_US
dc.publisher: Johns Hopkins University
dc.publisher.country: USA
dc.subject: System Identification
dc.subject: Stochastic Optimization
dc.subject: Simultaneous Perturbation Stochastic Approximation
dc.subject: Mixed Variable
dc.subject: Complex Variable
dc.title: Advances in System Identification and Stochastic Optimization
dc.type: Thesis
dc.type.material: text
thesis.degree.department: Applied Mathematics and Statistics
thesis.degree.discipline: Applied Mathematics & Statistics
thesis.degree.grantor: Johns Hopkins University
thesis.degree.grantor: Whiting School of Engineering
thesis.degree.level: Doctoral
thesis.degree.name: Ph.D.
Files
Original bundle
Name: WANG-DISSERTATION-2021.pdf
Size: 1.98 MB
Format: Adobe Portable Document Format