<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Publications | Personal Homepage</title><link>https://lxk-221.github.io/zh/publication/</link><atom:link href="https://lxk-221.github.io/zh/publication/index.xml" rel="self" type="application/rss+xml"/><description>Publications</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>zh-Hans</language><lastBuildDate>Mon, 09 Feb 2026 00:00:00 +0000</lastBuildDate><image><url>https://lxk-221.github.io/media/icon_hu68170e94a17a2a43d6dcb45cf0e8e589_3079_512x512_fill_lanczos_center_3.png</url><title>Publications</title><link>https://lxk-221.github.io/zh/publication/</link></image><item><title>MOSAIC: Bridging the Sim-to-Real Gap in Generalist Humanoid Motion Tracking and Teleoperation with Rapid Residual Adaptation</title><link>https://lxk-221.github.io/zh/publication/mosaic/</link><pubDate>Mon, 09 Feb 2026 00:00:00 +0000</pubDate><guid>https://lxk-221.github.io/zh/publication/mosaic/</guid><description>&lt;div class="flex px-4 py-3 mb-6 rounded-md bg-primary-100 dark:bg-primary-900">
&lt;span class="pr-3 pt-1 text-primary-600 dark:text-primary-300">
&lt;svg height="24" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">&lt;path fill="none" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="1.5" d="m11.25 11.25l.041-.02a.75.75 0 0 1 1.063.852l-.708 2.836a.75.75 0 0 0 1.063.853l.041-.021M21 12a9 9 0 1 1-18 0a9 9 0 0 1 18 0m-9-3.75h.008v.008H12z"/>&lt;/svg>
&lt;/span>
&lt;span class="dark:text-neutral-300">Released on arXiv.&lt;/span>
&lt;/div>
&lt;p>MOSAIC is an open-source, full-stack system for humanoid motion tracking and teleoperation:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>Learns a general motion tracker&lt;/strong> via RL on a multi-source motion bank&lt;/li>
&lt;li>&lt;strong>Performs rapid residual adaptation&lt;/strong> to bridge the sim-to-real gap&lt;/li>
&lt;li>&lt;strong>Supports multiple interfaces&lt;/strong> for teleoperation&lt;/li>
&lt;li>&lt;strong>Validated with real-robot experiments&lt;/strong> demonstrating robust offline motion replay and online long-horizon teleoperation&lt;/li>
&lt;/ul>
&lt;p>&lt;strong>Authors&lt;/strong>: Bo-Sheng Huang*, Yibo Peng*, Xukun Li* (&lt;em>Equal contribution&lt;/em>). Corresponding authors: Zhenshan Bing†, Xinlong Wang†.&lt;/p></description></item><item><title>DECO: Decoupled Multimodal Diffusion Transformer for Bimanual Dexterous Manipulation with a Plugin Tactile Adapter</title><link>https://lxk-221.github.io/zh/publication/deco/</link><pubDate>Thu, 05 Feb 2026 00:00:00 +0000</pubDate><guid>https://lxk-221.github.io/zh/publication/deco/</guid><description>&lt;div class="flex px-4 py-3 mb-6 rounded-md bg-primary-100 dark:bg-primary-900">
&lt;span class="pr-3 pt-1 text-primary-600 dark:text-primary-300">
&lt;svg height="24" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">&lt;path fill="none" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="1.5" d="m11.25 11.25l.041-.02a.75.75 0 0 1 1.063.852l-.708 2.836a.75.75 0 0 0 1.063.853l.041-.021M21 12a9 9 0 1 1-18 0a9 9 0 0 1 18 0m-9-3.75h.008v.008H12z"/>&lt;/svg>
&lt;/span>
&lt;span class="dark:text-neutral-300">Released on arXiv.&lt;/span>
&lt;/div>
&lt;p>DECO is a decoupled multimodal diffusion transformer for bimanual dexterous manipulation that:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>Disentangles multimodal inputs&lt;/strong> (vision, proprioception, tactile) through specialized conditioning pathways&lt;/li>
&lt;li>&lt;strong>Features a lightweight tactile adapter&lt;/strong> for parameter-efficient injection of tactile signals&lt;/li>
&lt;li>&lt;strong>Achieves a 72.25% success rate&lt;/strong> with a 21% improvement over the baseline&lt;/li>
&lt;li>&lt;strong>Introduces DECO-50 dataset&lt;/strong> with 50 hours of data and over 5M frames&lt;/li>
&lt;/ul></description></item></channel></rss>