Knitting Skeletons: Computer-Aided Design Tool for Shaping and Patterning of Knitted Garments
UIST 2019 (Paper)
Alexandre Kaspar, Liane Makatura, and Wojciech Matusik
Neural Inverse Knitting: From Images to Manufacturing Instructions
ICML 2019 (Paper)
Alexandre Kaspar, Tae-Hyun Oh, Liane Makatura, Petr Kellnhofer, Jacqueline Aslarus, and Wojciech Matusik
Environment-Scale Fabrication: Replicating Outdoor Climbing Experiences
CHI 2017 (Paper)
Emily Whiting, Nada Ouf, Liane Makatura, Christos Mousas, Zhenyu Shu, and Ladislav Kavan
In Progress: Honors Research Thesis
An exciting exercise in computational fabrication research -- currently under wraps (or sawdust, as the case may be) but many more details coming soon!
Balancing Act: An Interactive Tool for Fabricating Calder-Style Hanging Mobiles
Spring 2016 | Prof. Emily Whiting | CS89: Computational Fabrication
Project Duration: 4 weeks
Teammates: Catherine Most, Gloria Li
This project posed several interesting challenges: we sought a tool that offered intuitive yet unrestricted creative control over custom mobile designs while also guaranteeing a fabricable result that matched the specified configuration. I worked with two partners, but was primarily responsible for proposing our novel design paradigm. I also implemented the underlying data structures, physical balance calculations, and interactive elements of our prototype, and contributed to the final demos and write-up for presentation.
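The physical calculations at the heart of a Calder-style mobile reduce to a recursive moment balance: each bar hangs level only if the torques of its two sub-mobiles about the pivot are equal. The sketch below illustrates that condition under simplifying assumptions (massless bars, point weights); the names and structure are illustrative, not the actual implementation from our prototype.

```python
# Hypothetical sketch of the balance condition for a Calder-style mobile.
# Assumes massless bars and point weights; names are illustrative only.

from dataclasses import dataclass
from typing import Union

@dataclass
class Weight:
    mass: float  # mass of a terminal ornament


@dataclass
class Bar:
    left: "Node"   # sub-mobile hung from the left arm
    right: "Node"  # sub-mobile hung from the right arm
    length: float  # total bar length


Node = Union[Weight, Bar]


def total_mass(node: Node) -> float:
    """Total mass hanging at or below a node (bar mass neglected)."""
    if isinstance(node, Weight):
        return node.mass
    return total_mass(node.left) + total_mass(node.right)


def pivot_offset(bar: Bar) -> float:
    """Distance from the bar's left end at which it balances.
    Moment balance: m_l * d = m_r * (L - d)  =>  d = L * m_r / (m_l + m_r)."""
    m_l = total_mass(bar.left)
    m_r = total_mass(bar.right)
    return bar.length * m_r / (m_l + m_r)


# Example: a 10-unit bar with a 2-unit weight on the left arm and a
# small sub-mobile (total mass 3) on the right arm.
sub = Bar(Weight(1.0), Weight(2.0), length=4.0)
top = Bar(Weight(2.0), sub, length=10.0)
print(pivot_offset(top))  # 6.0 -- pivot sits 6 units from the left end
```

Computing pivot positions bottom-up like this is what lets a design tool guarantee that any configuration it emits will actually hang level once fabricated.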
Deep Forensics: Using CNNs to Differentiate CGI from Photographic Images
Winter 2016 | Prof. Lorenzo Torresani | CS89: Visual Recognition
Project Duration: 5 weeks
Teammate: Shruti Agarwal
For our final project, my partner and I explored whether deep convolutional neural networks (CNNs) could learn, directly from image pixels, a representation capable of discriminating between photographs and computer-generated images. We created a custom dataset for this purpose, which we were encouraged to polish for public release to the research community. After obtaining results that averaged 70% accuracy across several strategically devised (and intentionally challenging) conditions, we confirmed our belief that CNNs could be a powerful and effective tool for this task. We are actively refining our experiments and approaches with the goal of obtaining publishable results.