MyTimeMachine: Personalized Facial Age Transformation

¹The University of North Carolina at Chapel Hill, ²The University of Maryland

MyTimeMachine personalizes a pre-trained global aging prior using $\sim$50 personal selfies, allowing age regression (de-aging) and age progression (aging) with high fidelity and identity preservation.

Abstract

Facial aging is a complex process, highly dependent on multiple factors such as gender, ethnicity, and lifestyle, making it extremely challenging to learn a global aging prior that accurately predicts aging for any individual. Existing techniques often produce realistic and plausible aging results, but the re-aged images frequently do not resemble the person's actual appearance at the target age and thus require personalization. In many practical applications of virtual aging, e.g., VFX in movies and TV shows, a personal photo collection of the user depicting aging over a limited time span (20$\sim$40 years) is often available. However, naive attempts to personalize global aging techniques on personal photo collections often fail. We therefore propose MyTimeMachine (MyTM), which combines a global aging prior with a personal photo collection (using as few as 50 images) to learn a personalized age transformation. We introduce a novel Adapter Network that combines personalized aging features with global aging features and generates a re-aged image with StyleGAN2. We also introduce three loss functions to personalize the Adapter Network: a personalized aging loss, an extrapolation regularization, and an adaptive $w$-norm regularization. Our approach extends to videos as well, achieving high-quality, identity-preserving, and temporally consistent aging effects that resemble actual appearances at target ages, demonstrating its superiority over state-of-the-art approaches.
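To make the three personalization objectives above concrete, here is a minimal PyTorch sketch of how they might be combined into a single training loss. The loss weights, tensor shapes (W+ latents of shape B x 18 x 512), and the exact form of each term are illustrative assumptions, not the paper's formulation.

    import torch
    import torch.nn.functional as F

    # Hedged sketch: the lambda weights, shapes, and exact loss forms
    # below are assumptions for illustration, not the paper's definitions.
    def personalization_loss(w_pred, w_global, feat_pred, feat_ref,
                             target_age, train_age_range,
                             lambda_extrap=0.1, lambda_wnorm=0.005):
        # Personalized aging loss: match features extracted from the
        # re-aged image to reference features of the same person near
        # the target age.
        l_personal = F.l1_loss(feat_pred, feat_ref)

        # Extrapolation regularization: for target ages outside the range
        # covered by the personal photo collection, keep the personalized
        # latent close to the global prior's latent.
        lo, hi = train_age_range
        outside = ((target_age < lo) | (target_age > hi)).float()   # (B,)
        per_sample = (w_pred - w_global).pow(2).mean(dim=(1, 2))    # (B,)
        l_extrap = (outside * per_sample).mean()

        # Adaptive w-norm regularization: penalize large deviations from
        # the global latent so personalization does not drift off the
        # StyleGAN2 manifold.
        l_wnorm = (w_pred - w_global).norm(dim=-1).mean()

        return l_personal + lambda_extrap * l_extrap + lambda_wnorm * l_wnorm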

Video


Motivation: Why Age Transformation?


Method

Given an input face of Oprah Winfrey at 70 years old, our adapter re-ages her face to resemble her appearance at 30, while preserving the style of the input image. To achieve personalized re-aging, we collect $\sim$50 images of an individual across different ages and train an adapter network that updates the latent code generated by the global age encoder SAM. Our adapter preserves identity during interpolation when the target age falls within the range of ages seen in the training data, while also extrapolating well to unseen ages.
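For concreteness, below is a minimal PyTorch sketch of the adapter's role in this pipeline. The module name AgingAdapter, the residual update in W+ space, the MLP architecture, and the sam_encoder / stylegan2_generator calls in the usage comment are all assumptions for exposition, not the released implementation.

    import torch
    import torch.nn as nn

    # Illustrative sketch, assuming SAM-style W+ latents (18 x 512)
    # decoded by StyleGAN2. Names and dimensions are assumptions.
    class AgingAdapter(nn.Module):
        def __init__(self, n_styles=18, w_dim=512):
            super().__init__()
            # Small MLP mapping the global latent (plus target age) to a
            # personalized residual in W+ space.
            self.mlp = nn.Sequential(
                nn.Linear(w_dim + 1, w_dim), nn.LeakyReLU(0.2),
                nn.Linear(w_dim, w_dim),
            )

        def forward(self, w_global, target_age):
            # w_global: (B, 18, 512) latent from the global age encoder (SAM)
            # target_age: (B,) normalized target age
            age = target_age.view(-1, 1, 1).expand(-1, w_global.size(1), 1)
            delta = self.mlp(torch.cat([w_global, age], dim=-1))
            return w_global + delta  # personalized latent for StyleGAN2

    # Hypothetical usage (sam_encoder / stylegan2_generator are stand-ins):
    #   w_personal = adapter(sam_encoder(image, target_age), target_age)
    #   re_aged = stylegan2_generator(w_personal)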

Animation

Age Regression (De-aging from ~70 years old)

Interactive comparison for de-aging: Source (~70 years old), Reference, Ours (30~70), SAM, and Fading. Select different celebrities by clicking on the thumbnails, and move the slider to adjust the target age.

Age Progression (Aging from ~40 years old)

Interactive comparison for aging: Source (~40 years old), Reference, Ours (20~40), SAM, and Fading. Select different celebrities by clicking on the thumbnails, and move the slider to adjust the target age.

Acknowledgements

We thank Noah Frahm for reviewing early drafts and suggesting helpful improvements. We thank Yiran Xu for thoughtful discussions and feedback during the course of this research.

BibTeX

@misc{qi2024mytimemachinepersonalizedfacialage,
  title={MyTimeMachine: Personalized Facial Age Transformation},
  author={Luchao Qi and Jiaye Wu and Bang Gong and Annie N. Wang and David W. Jacobs and Roni Sengupta},
  year={2024},
  eprint={2411.14521},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2411.14521},
}