MyTimeMachine: Personalized Facial Age Transformation

SIGGRAPH 2025 (Journal Track)

UNC Chapel Hill · University of Maryland

MyTimeMachine personalizes a pre-trained global aging prior using $\sim$50 personal selfies, allowing age regression (de-aging) and age progression (aging) with high fidelity and identity preservation.

Abstract

Facial aging is a complex process that depends on many factors, such as gender, ethnicity, and lifestyle, making it extremely challenging to learn a global aging prior that accurately predicts aging for any individual. Existing techniques often produce realistic and plausible aging results, but the re-aged images frequently do not resemble the person's actual appearance at the target age and therefore require personalization. In many practical applications of virtual aging, e.g., VFX in movies and TV shows, a personal photo collection depicting the user's aging over a limited time span (20$\sim$40 years) is often available. However, naive attempts to personalize global aging techniques on such personal photo collections often fail. We therefore propose MyTimeMachine (MyTM), which combines a global aging prior with a personal photo collection (using as few as 50 images) to learn a personalized age transformation. We introduce a novel Adapter Network that combines personalized aging features with global aging features and generates a re-aged image with StyleGAN2. We also introduce three loss functions to train the Adapter Network: a personalized aging loss, an extrapolation regularization, and an adaptive w-norm regularization. Our approach also extends to videos, achieving high-quality, identity-preserving, and temporally consistent aging effects that resemble a person's actual appearance at the target age, demonstrating its superiority over state-of-the-art approaches.
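As a rough preview of that objective, the schematic below shows one plausible way the three losses could combine during adapter training. Every term is a simplified stand-in rather than the paper's exact formulation: `w_personal` and `w_global` are the adapted and global W+ latents, `delta` is the adapter's latent offset, `id_feat_pred`/`id_feat_ref` are face-recognition embeddings of the output and of reference photos near the target age, and `in_train_range` flags whether the target age falls inside the personal collection's age range; all of these names are hypothetical.

```python
import torch

def total_loss(w_personal, w_global, delta, id_feat_pred, id_feat_ref,
               in_train_range, lambdas=(1.0, 0.1, 0.01)):
    # Personalized aging loss (stand-in): pull the re-aged output's identity
    # features toward those of the person's photos near the target age.
    l_age = 1.0 - torch.cosine_similarity(id_feat_pred, id_feat_ref, dim=-1).mean()
    # Extrapolation regularization (stand-in): for target ages outside the
    # personal photo collection's range, stay close to the global prior.
    mask = (~in_train_range).float().view(-1, 1, 1)          # (B, 1, 1)
    l_extrap = (mask * (w_personal - w_global).pow(2)).mean()
    # Adaptive w-norm regularization (stand-in): keep the adapter's offset
    # small so the edited latent stays on the StyleGAN manifold.
    l_wnorm = delta.pow(2).mean()
    return lambdas[0] * l_age + lambdas[1] * l_extrap + lambdas[2] * l_wnorm
```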

Method

Given an input face of $\textit{Oprah Winfrey}$ at 70 years old, our adapter re-ages her face to resemble her appearance at 30 while preserving the style of the input image. To achieve personalized re-aging, we collect $\sim$50 images of an individual across different ages and train an Adapter Network that updates the latent code produced by the global age encoder SAM. The adapter preserves identity when interpolating to target ages within the range covered by the training data, and also extrapolates well to unseen ages, as sketched below.
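Below is a minimal PyTorch sketch of this adapter idea, assuming a SAM-style encoder that outputs an 18×512 W+ latent for a StyleGAN2 generator. The MLP size, the age conditioning, and the `sam_encoder`/`stylegan2` handles in the usage comment are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class AgingAdapter(nn.Module):
    """Predicts a personalized offset to the global W+ latent code."""

    def __init__(self, latent_dim: int = 512):
        super().__init__()
        # A small MLP shared across the 18 StyleGAN2 style layers,
        # conditioned on the normalized target age (an assumed design).
        self.mlp = nn.Sequential(
            nn.Linear(latent_dim + 1, latent_dim),
            nn.LeakyReLU(0.2),
            nn.Linear(latent_dim, latent_dim),
        )

    def forward(self, w_plus: torch.Tensor, target_age: torch.Tensor) -> torch.Tensor:
        # w_plus: (B, 18, 512) from the global age encoder (SAM);
        # target_age: (B,) normalized to [0, 1].
        age = target_age.view(-1, 1, 1).expand(-1, w_plus.size(1), 1)
        delta = self.mlp(torch.cat([w_plus, age], dim=-1))
        return w_plus + delta  # personalized latent, decoded by StyleGAN2

# Hypothetical usage (sam_encoder and stylegan2 stand in for pretrained models):
#   w_global = sam_encoder(image, target_age)      # (1, 18, 512)
#   w_personal = adapter(w_global, target_age)
#   re_aged = stylegan2.synthesis(w_personal)
```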

Animation

Age Regression (De-aging from ~70 years old)

Interactive comparison: Source (~70 years old), Reference, Ours (30–70), SAM, and Fading.

Select different celebrities by clicking on the thumbnails, and move the slider to adjust the target age.

Age Progression (Aging from ~40 years old)

Interactive comparison: Source (~40 years old), Reference, Ours (20–40), SAM, and Fading.

Select different celebrities by clicking on the thumbnails, and move the slider to adjust the target age.

Q&A

  1. Why a GAN instead of diffusion?

    At the time of this research, GANs handled the inversion-editing trade-off better than diffusion models, thanks to StyleGAN's well-trained latent space.
    Empirically, we find that a GAN can preserve the input image's style (pose, lighting, expression, etc.) while still providing strong editing power for aging, something diffusion models struggle with.
    More specifically, we find that diffusion inversion/editing methods (rf-inversion, rf-solver-edit, etc.) tend to overfit to the input image when editing age; see the appendix for more details. The inversion-editing workflow in question is sketched below.
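For concreteness, here is a minimal sketch of that inversion-editing workflow. `encoder`, `generator`, and `age_direction` are hypothetical handles standing in for a pretrained W+ encoder (e.g., e4e/SAM), a StyleGAN2 generator, and a latent age direction.

```python
import torch

@torch.no_grad()
def reage_with_gan(image, encoder, generator, age_direction, strength=1.0):
    # 1) Inversion: project the input image into StyleGAN2's W+ space.
    w_plus = encoder(image)                        # (1, 18, 512)
    # 2) Editing: move along a latent age direction; W+'s structure keeps
    #    pose, lighting, and expression largely intact during the edit.
    w_edited = w_plus + strength * age_direction
    # 3) Decoding: synthesize the re-aged image.
    return generator.synthesis(w_edited)
```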

  2. Why face-swapping instead of direct video-based re-aging?

    At the time of this research, face-swapping empirically outperformed direct video-based re-aging methods (e.g., STIT and VideoEditGAN) for aging.
    'Better' here means less flickering, faster inference (no per-frame pivotal tuning), and higher overall quality; see the paper for more details. A sketch of this pipeline follows.
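As a rough sketch of that face-swapping route (with `align_face`, `reage_image`, and `paste_back` as hypothetical helpers; the actual pipeline details are in the paper):

```python
def reage_video(frames, reage_image, align_face, paste_back):
    """Re-age a video by editing each frame's face crop in image space,
    then compositing (face-swapping) the result back into the frame."""
    output = []
    for frame in frames:
        crop, transform = align_face(frame)       # detect and align the face
        aged_crop = reage_image(crop)             # image-space re-aging
        output.append(paste_back(frame, aged_crop, transform))
    return output
```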

Acknowledgements

We thank Noah Frahm for reviewing early drafts and suggesting helpful improvements. We thank Yiran Xu for thoughtful discussions and feedback during the course of this research. This research was partially funded by Lenovo Research (Morrisville, NC). We are grateful to the members of the Mobile Technology Innovations Lab for their support and assistance.

BibTeX

@misc{qi2024mytimemachinepersonalizedfacialage,
      title={MyTimeMachine: Personalized Facial Age Transformation},
      author={Luchao Qi and Jiaye Wu and Bang Gong and Annie N. Wang and David W. Jacobs and Roni Sengupta},
      year={2024},
      eprint={2411.14521},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2411.14521}
}