- Rethinking Backdoor Attacks on Dataset Distillation: A Kernel Method Perspective
  Dataset distillation offers a potential means to enhance data efficiency in deep learning, and recent studies have shown that it can counteract backdoor risks present in the original training samples. In this study, we examine the theory of backdoor attacks and dataset distillation through the lens of kernel methods and introduce two theory-driven trigger pattern generation methods specialized for dataset distillation. Through a comprehensive set of analyses and experiments, we show that our optimization-based trigger design framework yields effective backdoor attacks on dataset distillation. Notably, datasets poisoned with our designed triggers resist conventional backdoor detection and mitigation methods, and our empirical results confirm that the triggers produced by our approaches mount robust backdoor attacks. 6 authors · Nov 28, 2023
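  A minimal sketch of the kind of optimization-based trigger design this abstract describes, assuming an RBF kernel as a stand-in for the neural tangent kernel used in kernel-based distillation (e.g., KIP); the tensors `x_distill` and `y_distill`, the stealth penalty, and all hyperparameters are illustrative assumptions, not the authors' method:

  ```python
  # Sketch (not the paper's code): optimize a backdoor trigger against a
  # kernel-ridge-regression surrogate of a classifier trained on a distilled set.
  # An RBF kernel stands in for the NTK; all names and values are illustrative.
  import torch

  def rbf_kernel(a, b, gamma=0.1):
      # K[i, j] = exp(-gamma * ||a_i - b_j||^2)
      return torch.exp(-gamma * torch.cdist(a, b).pow(2))

  def krr_predict(x_query, x_support, y_support, ridge=1e-3):
      # Kernel ridge regression: f(x) = K(x, S) (K(S, S) + ridge I)^{-1} y
      k_ss = rbf_kernel(x_support, x_support)
      k_qs = rbf_kernel(x_query, x_support)
      alpha = torch.linalg.solve(k_ss + ridge * torch.eye(len(x_support)), y_support)
      return k_qs @ alpha

  # Illustrative data: a small distilled support set and clean test inputs
  # (flattened images); labels are one-hot.
  torch.manual_seed(0)
  dim, n_support, n_test, n_classes = 64, 20, 32, 10
  x_distill = torch.randn(n_support, dim)
  y_distill = torch.eye(n_classes)[torch.randint(n_classes, (n_support,))]
  x_test = torch.randn(n_test, dim)
  target_class = 0
  y_target = torch.eye(n_classes)[target_class].expand(n_test, -1)

  # Optimize an additive trigger so triggered inputs score as the target class,
  # with a small norm penalty as a crude stealth constraint.
  trigger = torch.zeros(dim, requires_grad=True)
  opt = torch.optim.Adam([trigger], lr=0.05)
  for step in range(200):
      preds = krr_predict(x_test + trigger, x_distill, y_distill)
      loss = torch.mean((preds - y_target) ** 2) + 0.01 * trigger.norm()
      opt.zero_grad()
      loss.backward()
      opt.step()
  print(f"final loss: {loss.item():.4f}, trigger norm: {trigger.norm().item():.3f}")
  ```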
- SOCIA: Joint Structure-Parameter Co-Optimization for Automated Simulator Construction
  Building credible simulators from data is difficult because structure design, parameter calibration, and out-of-distribution (OOD) robustness are tightly coupled. We introduce SOCIA (Simulation Orchestration for Computational Intelligence with Agents), a framework that treats simulator construction as joint structure-parameter co-optimization: it elicits mechanism-rich blueprints, exposes explicit tunable parameters, and instantiates a calibration schema, producing an executable simulator with built-in calibration hooks. SOCIA couples Bayesian Optimization for sample-efficient point calibration with Simulation-Based Inference for uncertainty-aware fitting; diagnostics trigger targeted structural edits in an outer refinement loop that co-optimizes design and parameters under tight budgets. Across three diverse tasks, SOCIA consistently outperforms strong baselines, excelling both at in-distribution (ID) fitting and under OOD shift. Ablations that weaken the structure, calibration design, or tuning yield near-monotone degradations, underscoring the necessity of unified structure-parameter optimization. We will release the code soon. 7 authors · May 17
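  A minimal sketch of the calibrate-then-diagnose loop the abstract outlines, using a toy simulator; random search stands in for Bayesian Optimization and rejection ABC stands in for full Simulation-Based Inference, and the simulator, parameter ranges, and acceptance threshold are all assumptions rather than SOCIA's implementation:

  ```python
  # Sketch (not SOCIA's code): point calibration plus uncertainty-aware fitting
  # on a toy simulator, with a diagnostic that could trigger structural edits.
  import numpy as np

  rng = np.random.default_rng(0)

  def simulator(rate, noise, n=200):
      # Toy mechanism: exponential inter-event times (mean 1/rate) with
      # additive Gaussian observation noise.
      return rng.exponential(1.0 / rate, size=n) + rng.normal(0.0, noise, size=n)

  def discrepancy(sim, obs):
      # Compare summary statistics (mean and std) of simulated vs. observed data.
      return abs(sim.mean() - obs.mean()) + abs(sim.std() - obs.std())

  observed = simulator(rate=2.0, noise=0.1)  # stand-in for real data

  # Point calibration: random search as a stand-in for sample-efficient BO.
  best_params, best_loss = None, np.inf
  for _ in range(300):
      theta = (rng.uniform(0.5, 5.0), rng.uniform(0.01, 0.5))
      loss = discrepancy(simulator(*theta), observed)
      if loss < best_loss:
          best_params, best_loss = theta, loss

  # Uncertainty-aware fitting: rejection ABC as a stand-in for SBI.
  posterior = []
  for _ in range(2000):
      theta = (rng.uniform(0.5, 5.0), rng.uniform(0.01, 0.5))
      if discrepancy(simulator(*theta), observed) < 0.1:
          posterior.append(theta)

  # Diagnostic for the outer refinement loop: a near-empty posterior suggests
  # the structure cannot fit the data and should prompt a structural edit.
  print(f"point estimate: {best_params}, loss {best_loss:.4f}")
  print(f"accepted posterior samples: {len(posterior)} / 2000")
  ```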