In the context of model compression using the student-teacher paradigm, we propose student-centric learning, in which the student is less constrained by the teacher and has more flexibility to learn on its own during training. Towards student-centric learning, we propose two approaches: correlation-based learning and self-guided learning. In correlation-based learning, we guide the student with two types of correlations between activations: the correlation between different channels and the correlation between different spatial locations. In self-guided learning, we give the student network the opportunity to learn by itself in the form of additional self-taught neurons. We empirically validate our approaches on benchmark datasets, producing state-of-the-art results. Notably, our approaches can train a smaller and shallower 5-layer student network that outperforms a larger and deeper 11-layer teacher network by nearly 1% on CIFAR-100.
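To make the correlation-based objective concrete, the following is a minimal NumPy sketch (not the paper's implementation) of the two correlation types described above: a channel-wise correlation matrix and a spatial correlation matrix computed from an activation map, with a simple squared-error loss matching the student's correlations to the teacher's. The function names and the exact normalization (cosine similarity) are illustrative assumptions; the channel term as written also assumes the student and teacher have the same number of channels, while the spatial term only requires matching spatial dimensions.

```python
import numpy as np

def channel_correlation(act):
    """Correlation between channels of an activation map act of shape (C, H, W)."""
    C = act.shape[0]
    f = act.reshape(C, -1)                                      # (C, H*W)
    f = f / (np.linalg.norm(f, axis=1, keepdims=True) + 1e-8)   # L2-normalize each channel
    return f @ f.T                                              # (C, C) cosine similarities

def spatial_correlation(act):
    """Correlation between spatial locations of an activation map act of shape (C, H, W)."""
    C = act.shape[0]
    f = act.reshape(C, -1)                                      # (C, H*W)
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + 1e-8)   # L2-normalize each location
    return f.T @ f                                              # (H*W, H*W) cosine similarities

def correlation_loss(student_act, teacher_act):
    """Squared-error mismatch between student and teacher correlation structures."""
    l_ch = np.mean((channel_correlation(student_act) -
                    channel_correlation(teacher_act)) ** 2)
    l_sp = np.mean((spatial_correlation(student_act) -
                    spatial_correlation(teacher_act)) ** 2)
    return l_ch + l_sp
```

Because the loss compares C-by-C and HW-by-HW similarity matrices rather than raw activations, the student is free to use its own feature representation as long as the pairwise relationships among channels and among locations resemble the teacher's.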