HY-WU (Part I): An Extensible Functional Neural Memory Framework and An Instantiation in Text-Guided Image Editing

Tencent HY Team

We propose HY-WU: a scalable framework for on-the-fly conditional generation of low-rank adapter (LoRA) updates. HY-WU synthesizes instance-conditioned adapter weights from hybrid image–instruction representations and injects them into a frozen backbone during the forward pass, producing instance-specific operators without any test-time optimization.
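The core mechanism can be sketched in a few lines. This is a minimal, hypothetical illustration (not the released implementation): a small linear "hypernetwork" maps a condition embedding to LoRA factors `A` and `B`, which are added to a frozen weight on the fly. All names, dimensions, and the linear hypernetwork itself are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

D, R, C = 8, 2, 4  # feature dim, LoRA rank, condition-embedding dim (illustrative)

# Frozen backbone weight, and a toy hypernetwork mapping a condition
# embedding (e.g. fused image + instruction features) to LoRA factors.
W = rng.standard_normal((D, D))
H_A = rng.standard_normal((C, D * R)) * 0.01
H_B = rng.standard_normal((C, R * D)) * 0.01

def generate_lora(cond):
    """Map one condition vector to instance-specific low-rank factors."""
    A = (cond @ H_A).reshape(D, R)
    B = (cond @ H_B).reshape(R, D)
    return A, B

def forward(x, cond):
    """Frozen backbone plus an on-the-fly low-rank update: y = x (W + A B)."""
    A, B = generate_lora(cond)
    return x @ (W + A @ B)

x = rng.standard_normal((1, D))
y1 = forward(x, rng.standard_normal(C))   # one instance-specific operator
y0 = forward(x, np.zeros(C))              # zero condition => plain frozen backbone
```

Note that no gradient step is taken at inference: each request costs one extra forward pass through the generator, in contrast to per-instance fine-tuning.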

Key Features

🧠

Functional Neural Memory

HY-WU introduces a lightweight "neural memory" for AI. It generates a conditioned adapter for each request (no fine-tuning required), enabling instance-level personalization while preserving the base model's general capability.

🏆

Scalable for Large Models

HY-WU remains practical for large foundation models, even at 80B parameters. Through structured parameter tokenization, the method stays naturally compatible with large-scale architectures.
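One way to read "structured parameter tokenization" is that each target layer's adapter weights are flattened into a fixed layout of tokens, so a generator can emit a token sequence that is then routed back to per-layer matrices. The sketch below is an assumed, simplified version of that idea; `TOKEN_DIM`, the layer names, and the ceil-padding scheme are all illustrative, not the paper's specification.

```python
import numpy as np

TOKEN_DIM = 16  # size of each parameter token (hypothetical choice)

def build_layout(layer_shapes):
    """Record, per layer, its weight shape and how many TOKEN_DIM-sized
    tokens are needed to hold the flattened (zero-padded) weights."""
    layout = []
    for name, (rows, cols) in layer_shapes.items():
        n_tokens = -(-(rows * cols) // TOKEN_DIM)  # ceil division
        layout.append((name, (rows, cols), n_tokens))
    return layout

def detokenize(tokens, layout):
    """Slice a generated token sequence back into per-layer weight
    matrices, dropping any padding at the end of each layer's slice."""
    params, offset = {}, 0
    for name, (rows, cols), n_tokens in layout:
        flat = tokens[offset:offset + n_tokens].reshape(-1)[: rows * cols]
        params[name] = flat.reshape(rows, cols)
        offset += n_tokens
    return params

# Two illustrative LoRA factor shapes for one attention projection.
shapes = {"attn.q.lora_A": (8, 2), "attn.q.lora_B": (2, 8)}
layout = build_layout(shapes)
total = sum(n for _, _, n in layout)

# Stand-in for generator output: one vector per parameter token.
generated = np.arange(total * TOKEN_DIM, dtype=float).reshape(total, TOKEN_DIM)
params = detokenize(generated, layout)
```

Because the layout depends only on shapes, not parameter count per se, the same tokenization applies unchanged as the backbone grows, which is what makes the scheme compatible with very large models.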

🎨

Strong Human Preference

HY-WU achieves high human preference win-rates against open-source models, exceeds strong closed-source baselines, and remains close to the latest Nano-Banana series.

Showcases

Cross-Domain Clothing Fusion

Creative Cosplay and Character Outfit Migration

High-Fidelity Face Identity Transfer

Seamless Outfit Transfer and Virtual Try-on

High-Quality Texture Synthesis

Evaluation

GSB (Human Evaluation)

HY-WU substantially outperforms leading open-source models and remains competitive with top-tier closed-source commercial systems. While Nano Banana 2 and Nano Banana Pro achieve slightly higher overall scores (52.4% and 53.8%, respectively), the margin remains modest. Given that these commercial systems are likely trained with substantially larger backbones and proprietary data, this small gap suggests that our operator-level conditional adaptation remains effective even at a more constrained model scale.
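For readers unfamiliar with GSB (Good/Same/Bad) evaluation: annotators compare paired outputs and vote for either side or a tie, and a win-rate is derived from the tallies. The convention below, splitting ties evenly, is one common choice and an assumption here; the paper's exact scoring rule may differ, and the counts are illustrative, not from this evaluation.

```python
def gsb_win_rate(good: int, same: int, bad: int) -> float:
    """Win-rate under a tie-splitting GSB convention: ties count half.
    (Other conventions report G - B or exclude ties entirely.)"""
    total = good + same + bad
    return (good + 0.5 * same) / total

# Illustrative tallies only: 520 wins, 280 ties, 200 losses over 1000 pairs.
rate = gsb_win_rate(520, 280, 200)  # → 0.66
```

Under this convention, 50% means parity with the baseline, so a score a few points above or below 50% indicates a modest preference either way.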

Human evaluation (GSB) against other models.

Acknowledgments

We thank the Tencent Hunyuan team for their support. HY-WU is part of the Tencent Hunyuan project.

BibTeX

@misc{wu2026hy-wu,
  author = {Tencent HY Team and Mengxuan Wu and Xuanlei Zhao and Ziqiao Wang and Ruichfeng Feng and Atlas Wang and Qinglin Lu and Kai Wang},
  title = {HY-WU (Part I): An Extensible Functional Neural Memory Framework and An Instantiation in Text-Guided Image Editing},
  year = {2026},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/Tencent-Hunyuan/HY-WU}},
  note = {Preprint}
}