Nikunj Indoriya

/nɪˈkʊndʒ ɪnˈdɔːriə/ noun

A computer science undergraduate building research-driven AI systems with strong foundations in mathematics, optimization, and machine learning.

An engineer who bridges theory and real-world deployment to create scalable, intelligent software.

Experience

Student Research Intern
National Astronomical Observatory of Japan, Tokyo, Japan

Designed and trained attention-based U-Net architectures for gamma-ray burst light curve reconstruction, achieving a 38–41% reduction in reconstruction-induced parameter uncertainty compared to interpolation and baseline neural models, validated on 521 GRBs.
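
To give a flavor of the attention-gating idea behind such an architecture, here is a minimal 1-D PyTorch sketch; the layer sizes, names, and toy shapes are illustrative assumptions, not the exact model used in this work.

```python
import torch
import torch.nn as nn

class AttentionGate1D(nn.Module):
    """Additive attention gate on a 1-D skip connection (Attention U-Net style)."""
    def __init__(self, gate_ch, skip_ch, inter_ch):
        super().__init__()
        self.w_g = nn.Conv1d(gate_ch, inter_ch, kernel_size=1)  # project decoder (gating) features
        self.w_x = nn.Conv1d(skip_ch, inter_ch, kernel_size=1)  # project encoder (skip) features
        self.psi = nn.Conv1d(inter_ch, 1, kernel_size=1)        # collapse to one attention map
        self.relu = nn.ReLU()
        self.sigmoid = nn.Sigmoid()

    def forward(self, gate, skip):
        # gate, skip: (batch, channels, time); assumed to share the same temporal length
        attn = self.sigmoid(self.psi(self.relu(self.w_g(gate) + self.w_x(skip))))
        return skip * attn  # down-weight uninformative time steps before concatenation

# Toy usage on a batch of light-curve feature maps (shapes are hypothetical)
gate = torch.randn(8, 64, 128)
skip = torch.randn(8, 32, 128)
gated = AttentionGate1D(gate_ch=64, skip_ch=32, inter_ch=16)(gate, skip)
print(gated.shape)  # torch.Size([8, 32, 128])
```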

Investigated time-aware Neural ODE models for continuous-time astrophysical modeling, identifying data sparsity and observational noise as key limiting factors and producing negative results that informed model selection and experimental design.
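
As a rough sketch of what a time-aware Neural ODE looks like in code, the snippet below uses the torchdiffeq library (an assumed dependency; the actual experiments may have used a different solver stack), with a toy dynamics network and time grid.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # assumed dependency: pip install torchdiffeq

class FluxDynamics(nn.Module):
    """Learned continuous-time dynamics dy/dt = f(t, y) of a latent light-curve state."""
    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, t, y):
        # Concatenate the current time so the dynamics are explicitly time-aware
        t_col = t.expand(y.shape[0], 1)
        return self.net(torch.cat([y, t_col], dim=-1))

func = FluxDynamics()
y0 = torch.randn(16, 2)               # hypothetical initial latent state per light curve
t_obs = torch.linspace(0.0, 1.0, 50)  # irregular observation times would be passed here instead
y_traj = odeint(func, y0, t_obs)      # integrate the learned ODE; shape (50, 16, 2)
print(y_traj.shape)
```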

Built end-to-end training pipelines, led hyperparameter optimization experiments, and incorporated uncertainty quantification and ensemble strategies to improve model stability, interpretability, and generalization.
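
The ensemble-based uncertainty estimate mentioned above can be illustrated with a deliberately small scikit-learn sketch; the toy data, model choice, and seeds are assumptions rather than a reproduction of the real pipeline.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = np.sin(6 * X[:, 0]) + 0.1 * rng.normal(size=200)  # toy stand-in for a noisy light curve

# Train identically specified models that differ only in their random seed
ensemble = [
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=seed).fit(X, y)
    for seed in range(5)
]

X_test = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
preds = np.stack([m.predict(X_test) for m in ensemble])  # (n_models, n_points)

mean = preds.mean(axis=0)  # ensemble prediction
std = preds.std(axis=0)    # member disagreement as a rough uncertainty proxy
print(mean[:3], std[:3])
```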

Contributed to two international research publications through rigorous experimentation, analysis, and scientific reporting.

SparQ Summer Intern 2025
QNu Labs, Bengaluru, India

Designed and deployed a highly available Kubernetes-based infrastructure focused on scalability, fault tolerance, and seamless container orchestration.

Automated deployment workflows using Docker and CI/CD pipelines, improving reliability and efficiency of cloud-native service delivery.

Strengthened production readiness by implementing containerized services, optimizing resource allocation, and enhancing system resilience across distributed environments.
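
As one small, concrete example of a production-readiness hook in this spirit, here is a stdlib-only Python sketch of a health endpoint that a Kubernetes liveness/readiness probe could call; the path and port are hypothetical, not the actual service.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Minimal health endpoint for container liveness/readiness probes."""
    def do_GET(self):
        if self.path == "/healthz":  # illustrative probe path
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # A Deployment's readinessProbe (httpGet) would target this port and path
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```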

In Between These Experiences

Between Research and Systems

My journey has been less about jumping between roles and more about sharpening direction. I began by exploring machine learning through a theoretical lens, fascinated by models and mathematical structure. Early on, progress meant improving metrics, trying new architectures, and understanding why certain approaches converged while others failed.

Over time, I realized that building meaningful AI systems requires far more than good architectures. Performance alone is fragile without rigor, and elegance in theory does not automatically translate to robustness in practice.

Research at NAOJ taught me discipline. It taught me to question assumptions, to account for uncertainty, and to treat negative results as insight rather than failure. Working with noisy, sparse scientific data forced me to slow down, design careful experiments, and think deeply about what the model was actually learning.

Systems work at QNu Labs pushed me in a different direction. I had to think about scalability, orchestration, and failure tolerance. Reliability mattered. Deployment constraints mattered. Infrastructure decisions shaped outcomes just as much as modeling choices.

Together, these experiences reshaped how I approach engineering. I stopped viewing research and systems as separate tracks and began to see them as interdependent layers of the same problem.

Today, I think in terms of end-to-end systems. From theory to infrastructure. From modeling to production. From experiments to impact.

Education

Indian Institute of Science Education and Research, Bhopal
Computer Science and Engineering

2022 - Surviving

GitHub Contributions

Research Publications

Gamma-Ray Burst Light-curve Reconstruction: A Comparative Machine and Deep Learning Analysis

The Astrophysical Journal Supplement Series, 2025, 281, 35

Authors: A. Manchanda, M. G. Dainotti, N. Indoriya, et al.

View Publication

Abstract

Gamma-ray bursts (GRBs), observed at high-z, are probes of the evolution of the Universe and can be used as cosmological tools. We mitigate gaps in light curves (LCs), including the plateau region, key to building the two-dimensional Dainotti relation. We reconstruct LCs using nine models: MLP, Bi-Mamba, Fourier transform, Gaussian process–random forest hybrid, Bi-LSTM, CGAN, SARIMAX-based Kalman filter, Kolmogorov–Arnold networks, and Attention U-Net, compared over 521 GRBs. MLP and Attention U-Net outperform other methods, with MLP reducing plateau parameter uncertainties by 37.2–41.2% and Attention U-Net achieving the largest uncertainty reduction. These improvements in parameter precision are needed to use GRBs as standard candles and predict GRB redshifts through machine learning.

Multi-Model Framework for Reconstructing Gamma-Ray Burst Light Curves

The Journal of High Energy Astrophysics, 2026, 51

Authors: A. Kaushal, M. G. Dainotti, N. Indoriya, et al.

View Publication

Abstract

Mitigating data gaps in gamma-ray burst (GRB) light curves is crucial for cosmological research and for improving the precision of parameters. This analysis improves the applicability of the two-dimensional Dainotti relation (Ta–La). The study expands on a previous 521 GRB sample by incorporating seven models: Deep Gaussian Process (DGP), Temporal Convolutional Network (TCN), CNN-BiLSTM, Bayesian Neural Network (BNN), Polynomial Curve Fitting, Isotonic Regression, and Quartic Smoothing Spline (QSS). QSS significantly reduces uncertainty across parameters (43.5% for log Ta, 43.2% for log Fa, 48.3% for α), and CNN-BiLSTM has the lowest outlier rate for α at 0.77%. These models complement traditional methods like Attention U-Net and MLP and support GRBs as cosmological probes and standard candles.

Tech Stack

Technologies powering my work across machine learning research, AI systems, and infrastructure.

Python
C
C++
SQL
JavaScript
HTML
PyTorch
TensorFlow
scikit-learn
Computer Vision
RAG
Vector Databases
Semantic Search
Linux
Bash
Git
Docker
Kubernetes
CI/CD
NumPy
pandas
PyCharm
IntelliJ

Beyond Code

I use LinkedIn as a living record of my journey. Today, it captures milestones and growth moments. Soon, it will expand into a space where I share deeper perspectives on AI architectures, research lessons, and engineering systems that scale.

Library

Dev

Pattern Recognition and Machine Learning by Christopher M. Bishop
Deep Learning by Ian Goodfellow, Yoshua Bengio, Aaron Courville
Artificial Intelligence: A Modern Approach by Stuart Russell, Peter Norvig
Computer Systems: A Programmer’s Perspective by Randal Bryant, David O’Hallaron
Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein
The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani, Jerome Friedman

Casual Reads

Sapiens: A Brief History of Humankind by Yuval Noah Harari
The Silk Roads: A New History of the World by Peter Frankopan
The Lessons of History by Will Durant, Ariel Durant
The Great Gatsby by F. Scott Fitzgerald
Meditations by Marcus Aurelius
Guns, Germs, and Steel by Jared Diamond

And many more. These are a few that left a lasting impact.

Things About Me

Beyond engineering systems and writing code, I’m deeply interested in understanding how things work at their core, whether that’s a learning algorithm, a distributed system, or a historical turning point. Curiosity is not just a professional trait for me; it’s a way of thinking.

I’m drawn to first principles. I enjoy breaking complex ideas down to their foundations and rebuilding them with clarity. That same mindset extends beyond technology into history, philosophy, and literature, fields that sharpen perspective and deepen judgment.

I believe the strongest engineers are not only technically capable but intellectually well-rounded. It’s the intersection of analytical rigor and human awareness that shapes meaningful systems.

Get in Touch

Connect with me on LinkedIn or shoot an email

Pomodoro Timer

You've reached the end! Or have you? Before you vanish into the digital void, I've got a quick Pomodoro Timer to help you focus better on your next big thing (or just to remind you to stop doomscrolling).
