INRFlow: Flow Matching of INRs in Ambient Space

Flow matching models have emerged as a powerful method for generative modeling across domains such as images and videos, as well as irregular or unstructured data such as 3D point clouds and protein structures. These models are typically trained in two stages: first a data compressor is trained, and then a flow matching generative model is trained in the latent space of that compressor. This two-stage paradigm is an obstacle to unifying models across data domains, since hand-crafted compressor architectures are used for the different data modalities. To address this, we present INRFlow, a domain-agnostic approach for learning flow matching transformers directly in ambient space. Taking inspiration from INRs (implicit neural representations), we introduce a conditionally independent, point-wise training objective that enables INRFlow to make continuous predictions in coordinate space. Our empirical results show that INRFlow effectively handles a variety of data modalities such as images, 3D point clouds, and protein structures, achieving strong performance in each domain and outperforming comparable approaches. INRFlow is a promising step towards flow matching generative models that operate in ambient space and can be easily adapted to different data domains.
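To make the point-wise, ambient-space objective described above concrete, here is a minimal sketch of a flow matching training step over coordinate-value pairs. This is an illustrative reconstruction, not the paper's implementation: the model class, function names, and dimensions below are hypothetical, and a small MLP stands in for the flow matching transformer.

```python
# Hypothetical sketch of a point-wise flow matching objective in ambient space.
# PointwiseFlowModel, flow_matching_step, and all shapes are illustrative only.
import torch
import torch.nn as nn

class PointwiseFlowModel(nn.Module):
    """Toy stand-in for a flow matching transformer: maps (noisy value, coordinate, time)
    to a per-point velocity. A real model would attend over all points jointly."""
    def __init__(self, coord_dim: int, value_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(coord_dim + value_dim + 1, hidden),
            nn.GELU(),
            nn.Linear(hidden, value_dim),
        )

    def forward(self, noisy_values, coords, t):
        # Broadcast the scalar time to every point so each prediction is time-conditioned.
        t_feat = t.expand(noisy_values.shape[0], noisy_values.shape[1], 1)
        return self.net(torch.cat([noisy_values, coords, t_feat], dim=-1))

def flow_matching_step(model, coords, values, optimizer):
    """One training step: interpolate linearly between noise and data values,
    then regress the per-point velocity (values - noise) at the sampled time."""
    noise = torch.randn_like(values)
    t = torch.rand(values.shape[0], 1, 1)            # one time per sample in the batch
    noisy_values = (1.0 - t) * noise + t * values    # linear interpolation path
    target_velocity = values - noise                 # constant velocity along that path
    pred_velocity = model(noisy_values, coords, t)
    loss = ((pred_velocity - target_velocity) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: a batch of 4 "images" flattened into 1024 (coordinate, RGB-value) pairs.
model = PointwiseFlowModel(coord_dim=2, value_dim=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
coords = torch.rand(4, 1024, 2)    # normalized (x, y) pixel coordinates
values = torch.rand(4, 1024, 3)    # RGB values at those coordinates
loss = flow_matching_step(model, coords, values, opt)
```

Because the loss is computed per coordinate-value pair, the same training loop applies unchanged to other modalities by swapping the coordinate and value dimensions (e.g., 3D coordinates with occupancy values, or residue indices with backbone atom positions).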

Figure 1: (a) A high-level overview of INRFlow, using the image domain as an example. Our model can be interpreted as an encoder-decoder model in which the decoder makes an independent prediction for each given coordinate-value pair. The coordinate and value dimensions change across data domains, but the model remains the same. (b) Samples generated by INRFlow trained on ImageNet 256×256. (c) 3D point clouds from images, generated by INRFlow trained on Objaverse (Deitke et al., 2023). (d) Protein structures generated by INRFlow trained on SwissProt (Boeckmann et al., 2003). The ground-truth protein structure is depicted in green, whereas the structure generated by INRFlow is depicted in orange.
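Since the decoder predicts each coordinate independently, sampling can query arbitrary coordinate sets, which is what makes the predictions continuous in coordinate space. The sketch below continues the hypothetical names from the previous snippet (`model`, its `(noisy_values, coords, t)` signature) and uses simple Euler integration; it is an assumed usage pattern, not the paper's sampler.

```python
# Hypothetical sampler: integrate the learned velocity field from noise to data,
# evaluated only at the requested coordinates. Continues the toy `model` above.
import torch

@torch.no_grad()
def sample(model, coords, value_dim=3, steps=50):
    """Euler integration of the velocity field from t=0 (noise) to t=1 (data)."""
    values = torch.randn(coords.shape[0], coords.shape[1], value_dim)  # start from noise
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((coords.shape[0], 1, 1), i * dt)
        values = values + dt * model(values, coords, t)
    return values

# Query a dense 64x64 grid even if training used sparser coordinate sets.
ys, xs = torch.meshgrid(torch.linspace(0, 1, 64), torch.linspace(0, 1, 64), indexing="ij")
dense_coords = torch.stack([xs, ys], dim=-1).reshape(1, -1, 2)  # (1, 4096, 2)
image_values = sample(model, dense_coords)                      # (1, 4096, 3)
```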


