When you think of exams, you probably imagine students scribbling away on paper in complete silence. But at GLA University, we’re doing things differently—starting with turning traditional assessments into real-time coding experiences.
We’re proud to be the first university in the country to adopt codeathons as a primary mode of assessment for Computer Science and allied courses. It’s not just a bold step, but one that reflects the way the tech industry itself is evolving.
Why Codeathons?
Tech companies today don't care how many formulas you can memorize; they want people who can solve problems, write efficient code, and think on their feet. That's why we moved away from outdated pen-and-paper exams and embraced hands-on coding assessments that mirror actual industry practices.
“It didn’t feel like an exam—it felt like a real job interview,” says Shreya Sharma, a final-year student.
From Solo to Synergy
We started out with individual codeathons: one student, one laptop, and three hours of non-stop coding. While this tested raw problem-solving, students felt isolated and missed the chance to collaborate; feedback told us the format was effective but monotonous.
To bring in teamwork, we introduced collaborative codeathons, allowing students to form their own pairs. This helped increase engagement, but also led to some students doing most of the work while others barely contributed.
So, we evolved. Today, pairs are randomly assigned, and they rotate every codeathon. This levels the playing field, prevents dependency, and helps students learn to work with different personalities—just like in real-world jobs.
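For the curious, here's a rough idea of how random pair rotation can work: shuffle the roster, pair students off, and reject any pairing that repeats a combination from an earlier codeathon. This is only an illustrative Python sketch with placeholder roll numbers, not the actual matching logic our platform uses.

```python
import random

def assign_pairs(students, past_pairs, max_tries=1000):
    """Shuffle the roster and pair students off, retrying if any pair repeats."""
    for _ in range(max_tries):
        shuffled = random.sample(students, len(students))
        pairs = [tuple(sorted(shuffled[i:i + 2]))
                 for i in range(0, len(shuffled) - 1, 2)]
        if not any(p in past_pairs for p in pairs):
            return pairs
    return pairs  # fall back to the last shuffle if every attempt repeated a pair

# Placeholder roll numbers, purely for illustration
students = ["CS101", "CS102", "CS103", "CS104", "CS105", "CS106"]
past_pairs = {("CS101", "CS102")}          # pairs used in earlier codeathons
new_pairs = assign_pairs(students, past_pairs)
past_pairs.update(new_pairs)               # remember them for the next rotation
print(new_pairs)
```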
Building a Secure, Fair Platform
As we scaled, we realized existing platforms had limitations—loopholes that allowed unfair practices, poor monitoring, and limited customization. So, we built our own in-house coding assessment platform.
The new platform came with features like:
- Randomized pairing
- Integrated proctoring
- Secure environments
Together, these features made cheating nearly impossible and allowed us to track genuine performance. Malpractice incidents dropped from 18% to just 3%, a huge win for academic honesty.
Listening, Learning, Improving
Every change we made was backed by student feedback and data. After we introduced random pairing in early 2024, student satisfaction rose by 30%. With open-book exams and the improved platform in place by mid-2025, overall satisfaction hit 92%.
Malpractice rates dropped drastically, and students reported stronger problem-solving and collaboration skills.
“The best part is that GLA listens,” shares Ananya Singh, a third-year student. “They actually take feedback seriously and make changes that help us.”
What’s Next?
We’re not stopping here. The next phase of our codeathon journey is even more exciting:
- First-year integration: Codeathons from day one to build skills early
- Extended durations: 5–6 hours and 24-hour assessments to mimic real hackathons
- Larger teams: Teams of 3–4 students to reflect actual workplace environments
- Detailed feedback: Personalized reports after every assessment to guide student improvement
Shaping Future-Ready Tech Talent
GLA University’s codeathon-based assessment model isn’t just a new way to take exams—it’s a smarter, more relevant way to prepare students for the real tech world. Through continuous evolution, collaboration, and a commitment to fairness, we’ve built something that’s not just innovative, but impactful.
So, the real question is:
Are you ready to code your future?