Cycles Improve Conditional Generators
Learning information-dense, low-dimensional latent representations of high-dimensional data is the central thesis of deep learning. The inverse problem is data generation, in which machines learn a mapping from information-dense latent representations back to high-dimensional data spaces. Conditional generation extends data generation to labelled data by estimating the joint distribution of samples and labels. This thesis connects learning meaningful latent representations through compressive and generative algorithms, and makes three primary contributions to the improvement and use of conditional GANs. The first is three novel architectures for conditional data generation that improve on baseline generation quality for a natural image dataset. The second is a novel approach to structuring latent representations by learning a paired structured condition space and a weakly structured variation space with desirable properties. The third is a novel application of conditional data generation to a chemical sensing task, where beneficial leaking augmentations in extremely low-data regimes (n < 100) demonstrate that conditional data generation improves the test performance of downstream supervised models.
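The conditioning mechanism the abstract describes can be sketched minimally: a conditional generator maps a latent code paired with a label into data space, typically by concatenating the two before the network's layers. The sketch below is illustrative only and is not drawn from the thesis; all sizes, names, and the single-layer architecture are hypothetical assumptions.

```python
import math
import random

def one_hot(label, num_classes):
    """Encode an integer class label as a one-hot list."""
    return [1.0 if i == label else 0.0 for i in range(num_classes)]

def conditional_generator(z, label, weights, bias, num_classes):
    """Map a (latent code, label) pair into data space.

    Conditioning is done by concatenating the latent code with a one-hot
    label before a single tanh layer. A real conditional GAN generator
    stacks many learned nonlinear layers; this sketch shows only the
    conditioning mechanism itself.
    """
    x = z + one_hot(label, num_classes)  # joint (z, y) input vector
    return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, bias)]

# Hypothetical sizes: 8-dim latent code, 10 classes, 16-dim "data" output.
random.seed(0)
latent_dim, num_classes, data_dim = 8, 10, 16
weights = [[random.gauss(0, 0.1) for _ in range(latent_dim + num_classes)]
           for _ in range(data_dim)]
bias = [0.0] * data_dim
z = [random.gauss(0, 1) for _ in range(latent_dim)]
sample = conditional_generator(z, label=3, weights=weights,
                               bias=bias, num_classes=num_classes)
print(len(sample))  # 16
```

In a trained model the weights would be learned adversarially against a discriminator that receives the same label, which is what lets a single generator produce samples from each class on demand.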
- Identifier: etd-20886
- Year: 2021
- Date created: 2021-04-30
- Last modified: 2021-09-15
Items
- Alexander_Moore_Masters_Thesis_Submit.pdf (Public)
Permanent link to this page: https://digital.wpi.edu/show/vq27zr634