UROP Proceedings 2024-25

Academy of Interdisciplinary Studies
Division of Integrative Systems and Design

Sea, Sense, and Melody - 3D Generation Algorithms
Supervisor: Tristan Camille BRAUD / ISD
Student: TAN Joshua / COMP
Course: UROP 1100, Summer

Sea, Sense, and Melody is a project that collects user feeling prompts and accumulates them into an abstract artwork: a collection of AI-generated fantasy-themed flowers. The flowers are generated through a pipeline consisting of a language model, a text-to-image model, and an image-to-3D model. This research project explores methods to improve the generation of the 3D flowers through different generation parameters and post-processing procedures. The models are also fine-tuned to better handle the intended variations of the fantasy-flower style. Finally, the deployment plan of the project, which links all the generative models, is also presented.

Sea, Sense, and Melody - 3D Generation Algorithms
Supervisor: Tristan Camille BRAUD / ISD
Student: TAN Ziheng / MATH-CS
Course: UROP 1100, Summer

This report presents the process and findings of applying text-to-image generation and LoRA (Low-Rank Adaptation) training with the Flux.1-dev model within the ComfyUI interface. A systematic investigation into prompt engineering, parameter tuning, and sampler selection was conducted to optimize image generation quality, focusing on the production of detailed flower images for 3D modeling. The study also describes the creation and captioning of a custom dataset and the subsequent training of a LoRA model to achieve a distinctive visual style. Results demonstrate that careful parameter selection and tailored LoRA training enhance image fidelity and consistency, although further data and training are necessary for improved generalization.

VR Metaverse for Education
Supervisor: Tristan Camille BRAUD / ISD
Student: KATYAYAN Saanvi Ravi / ISD
Course: UROP 1100, Spring

This project aims to create a virtual reality (VR) prototyping environment that enables direct manipulation of 3D objects within an immersive workspace. Built using Unity's XR frameworks (OpenXR, XR Interaction Toolkit) and tested on the Oculus Quest 3 and the Meta XR Simulator, the system implements six core operations: translation, directional/multi-axis rotation/scaling, object erasure, and scene reset. A state-driven architecture coordinates interactions between a central controller, object-specific scripts, and a modal UI, allowing users to modify environments without breaking immersion. Key highlights include a gesture-sensitive transformation system that maps controller orientation to object manipulation logic and a scalable event-driven UI. Current progress demonstrates reliable real-time object control with positional accuracy. Immediate next steps focus on implementing object duplication and refining the interface through progressive disclosure principles. The system lays the groundwork for automating VR environment design tasks, bridging the gap between conceptualization and implementation in VR development workflows.
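The first Sea, Sense, and Melody abstract above describes a three-stage generation pipeline (language model, then text-to-image, then image-to-3D). The sketch below is only a structural illustration of how such stages could be chained in Python; the abstract does not name the concrete models, so every function, class, and parameter here is a hypothetical placeholder rather than the project's actual implementation.

from dataclasses import dataclass
from pathlib import Path


@dataclass
class GenerationParams:
    # Illustrative generation parameters of the kind the project tunes;
    # the names and defaults are placeholders, not the reported settings.
    guidance_scale: float = 3.5
    num_inference_steps: int = 28
    seed: int = 0


def expand_prompt(feeling: str) -> str:
    """Stage 1: a language model turns a short user feeling into a detailed
    fantasy-flower prompt (placeholder implementation)."""
    return f"a fantasy flower expressing the feeling of {feeling}, highly detailed"


def generate_image(prompt: str, params: GenerationParams) -> Path:
    """Stage 2: a text-to-image model renders the flower (placeholder)."""
    raise NotImplementedError("plug in the project's text-to-image model here")


def image_to_3d(image_path: Path) -> Path:
    """Stage 3: an image-to-3D model lifts the render into a 3D asset, followed
    by post-processing of the mesh (placeholder)."""
    raise NotImplementedError("plug in the project's image-to-3D model here")


def generate_flower(feeling: str, params: GenerationParams) -> Path:
    # Chain the three stages: feeling prompt -> image -> 3D flower asset.
    prompt = expand_prompt(feeling)
    image_path = generate_image(prompt, params)
    return image_to_3d(image_path)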
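The second abstract's LoRA workflow was carried out in ComfyUI. As a rough illustration outside ComfyUI, the following sketch shows how a trained Flux.1-dev LoRA could be loaded and applied for inference with the Hugging Face diffusers library; the model ID is the public Flux.1-dev repository, while the LoRA path, prompt, and parameter values (steps, guidance scale, resolution, seed) are placeholders, not the settings reported in the study.

import torch
from diffusers import FluxPipeline

# Load the base Flux.1-dev model; bfloat16 keeps memory usage manageable.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()  # optional: offload modules to CPU to fit limited VRAM

# Apply a LoRA trained on a custom flower dataset (path is a placeholder).
pipe.load_lora_weights("path/to/flower_style_lora.safetensors")

# Generate one image; steps, guidance, resolution, and seed are illustrative
# values of the kind the report tunes, not its actual settings.
image = pipe(
    prompt="a detailed fantasy flower, soft lighting, intricate petals",
    num_inference_steps=28,
    guidance_scale=3.5,
    height=1024,
    width=1024,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
image.save("flower.png")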

RkJQdWJsaXNoZXIy NDk5Njg=