LLM Research
• Is Multilingual LLM Watermarking Truly Multilingual? A Simple Back-Translation Solution (arXiv:2510.18019)
• PORTool: Tool-Use LLM Training with Rewarded Tree (arXiv:2510.26020)
• POWSM: A Phonetic Open Whisper-Style Speech Foundation Model (arXiv:2510.24992)
• Ming-Flash-Omni: A Sparse, Unified Architecture for Multimodal Perception and Generation (arXiv:2510.24821)
• Generalization or Memorization: Dynamic Decoding for Mode Steering (arXiv:2510.22099)
• Omni-Reward: Towards Generalist Omni-Modal Reward Modeling with Free-Form Preferences (arXiv:2510.23451)
• ARC-Encoder: learning compressed text representations for large language models (arXiv:2510.20535)
• Continuous Autoregressive Language Models (arXiv:2510.27688)
• Can Visual Input Be Compressed? A Visual Token Compression Benchmark for Large Multimodal Models (arXiv:2511.02650)
• RADLADS: Rapid Attention Distillation to Linear Attention Decoders at Scale (arXiv:2505.03005)
• Titans: Learning to Memorize at Test Time (arXiv:2501.00663)