Magnetic resonance imaging (MRI) and hyperspectral imaging (HSI) provide complementary information for image-guided neurosurgery, combining high-resolution anatomical detail with tissue-specific optical characterization. This work presents a novel multimodal phantom dataset specifically designed for MRI–HSI integration. The phantoms reproduce a three-layer tissue structure (white matter, gray matter, and tumor) with superficial blood vessels, using agar-based compositions that mimic the MRI contrasts of the rat brain while providing consistent hyperspectral signatures. The dataset comprises 13 phantoms across two designs, each with MRI, HSI, RGB-D, and tracking acquisitions, along with pixel-wise labels and corresponding 3D models. It facilitates the evaluation of registration, segmentation, and classification algorithms, as well as depth estimation, multimodal fusion, and tracking-to-camera calibration procedures. By providing reproducible, labeled multimodal data, these phantoms reduce the need for animal experiments in preclinical imaging research and serve as a versatile benchmark for MRI–HSI integration and other multimodal imaging studies.