# Kausshik Manojkumar

> Software engineer at RTX working on safety-critical flight management systems. Builds retrieval systems, native macOS apps, and ML pipelines. Published at ACM TKDD 2025.

## About

Kausshik Manojkumar is a software engineer at RTX (Collins Aerospace) in Cedar Rapids, Iowa. He is one of three engineers selected to define the migration strategy for a 2-million-line Ada-to-C++ modernization of flight management software. He graduated from Iowa State University in May 2025 with a Bachelor of Science in Computer Science and a minor in Mathematics, with Honors (GPA 3.97).

Before RTX, he was a machine learning research assistant under Dr. Yang Li at Iowa State University, where he contributed experimental results to an ACM TKDD 2025 publication on graph-based time-series forecasting with Graph Sequence Attention.

He is interested in problems at the intersection of systems programming, AI/ML, and native application development. He cares about software that is correct, fast, and maintainable.

## Contact

- Email: kausshikmanojkumar@gmail.com
- GitHub: https://github.com/KAUSSHIK
- LinkedIn: https://linkedin.com/in/kausshikm
- Website: https://kausshik.dev
- Resume: https://kausshik.dev/Kausshik_Manojkumar_Resume.pdf

## Work Experience

### Software Engineer I — RTX (Collins Aerospace) (July 2025 — Present)

Cedar Rapids, IA. Safety-critical flight management systems.

- Selected as 1 of 3 engineers to define the migration strategy for a 2M+ LOC Ada-to-C++ modernization of flight management systems.
- Performed dependency analysis across 12+ behaviors and authored technical documentation defining sequencing, parallelization, and phased migration, now used as a reference by implementation teams.
- Reimplemented core legacy flight-plan functionality in modern C++, reducing code complexity and improving maintainability.
- Co-developed an internal PyTest framework with new preconditions and verification paths, reducing manual testing effort by ~40%.
### Machine Learning Research Assistant — Iowa State University (May 2024 — April 2025)

Under Dr. Yang Li.

- Drove experimental expansion for a graph-based time-series forecasting system across NYC Cab, energy load (ECL), and road traffic datasets.
- Built end-to-end experiment pipelines: preprocessing, graph inference via Gaussian Markov Random Fields, multi-horizon forecasting, and evaluation.
- Contributed benchmark results to an ACM TKDD 2025 publication demonstrating state-of-the-art accuracy.
- Benchmarked Graph Sequence Attention against standard attention mechanisms.

## Projects

### QuillReader — Native macOS PDF reader (In Development)

- URL: https://kausshik.dev/project/quillreader
- GitHub: https://github.com/KAUSSHIK/QuillReader
- Tech: Swift, SwiftUI, PDFKit, CryptoKit, Keychain, SSE streaming
- Privacy-first PDF reader with AI-powered annotations. Users control what data leaves their machine via a privacy dial (none / local / metadata / full-text). Non-destructive annotations are persisted as sidecar JSON files keyed by a SHA-256 document fingerprint. BYOAI provider protocol, so there is no vendor lock-in.

### UniLibrary RAG System — Production retrieval-augmented generation system (Shipped)

- URL: https://kausshik.dev/project/unilibrary-rag
- GitHub: https://github.com/KAUSSHIK/unilibrary-rag
- Tech: Python, LangChain, Qdrant, n8n, GCP Cloud Run, Next.js, React, TypeScript
- Semantic search engine over 1,000+ academic PDFs for a university research archive. Two-stage pipeline: hybrid retrieval (dense vectors + metadata filters) followed by an LLM validation pass that catches hallucinated citations before responses reach users. Deployed to 3 departments with 95%+ citation accuracy and sub-2s latency.
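The sidecar-annotation scheme in QuillReader (annotations stored as JSON beside the document, keyed by a SHA-256 fingerprint of the file bytes) can be sketched as follows. QuillReader itself is Swift/CryptoKit; this is a minimal Python illustration of the same idea, and the function names (`document_fingerprint`, `save_sidecar`) are illustrative, not taken from the repository.

```python
import hashlib
import json
from pathlib import Path

def document_fingerprint(pdf_path: str) -> str:
    """SHA-256 over the raw file bytes: a stable key that survives
    renames and moves, so annotations re-attach to the same document."""
    digest = hashlib.sha256()
    with open(pdf_path, "rb") as f:
        # Hash in chunks so large PDFs are never loaded fully into memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def save_sidecar(pdf_path: str, annotations: list, sidecar_dir: str) -> Path:
    """Persist annotations beside (never inside) the PDF, keyed by fingerprint,
    keeping the original document untouched (non-destructive)."""
    out = Path(sidecar_dir) / f"{document_fingerprint(pdf_path)}.json"
    out.write_text(json.dumps({"annotations": annotations}, indent=2))
    return out
```

Keying on content rather than path is what makes the annotations non-destructive and portable: the PDF can be renamed or moved and the sidecar still matches.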
### Time-Series Forecasting with Transformers (Research)

- URL: https://kausshik.dev/project/stock-transformer
- GitHub: https://github.com/KAUSSHIK/StockTransformer
- Tech: Python, PyTorch, NumPy, Pandas, Matplotlib, scikit-learn
- Custom Transformer architecture adapted for financial time series. Sliding-window training to prevent information leakage. Rigorous baseline comparison against ARIMA and LSTM: the Transformer outperformed ARIMA by 18% and LSTM by 7% on multi-step horizons (MSE). Identified failure mode: the Transformer underperforms LSTM during high-volatility regime changes.

## Skills

- Languages: C++, C, Python, Swift, Java, TypeScript, SQL, Bash
- ML / AI: RAG pipelines, Transformers, time-series modeling, embeddings, fine-tuning, model evaluation
- Backend & Systems: Spring Boot, FastAPI, Flask, REST APIs, WebSockets, Linux, large codebases
- Data & Infrastructure: MySQL, DynamoDB, Qdrant, AWS (EC2, S3, RDS), GCP, Docker, Git
- Frameworks: SwiftUI, PDFKit, React, Next.js, LangChain, PyTorch, n8n

## Education

- Iowa State University — B.S. Computer Science, Minor in Mathematics, with Honors. GPA 3.97. May 2025.
- Stanford University Online — Machine Learning Specialization. January 2024.

## Publications

- ACM TKDD 2025 — Graph-based time-series forecasting with Graph Sequence Attention (acknowledged contributor).
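As an illustration of the leakage-free sliding-window setup described in the Time-Series Forecasting project (each target window lies strictly after its input window, and the test split is taken chronologically rather than shuffled), here is a minimal NumPy sketch; the function and parameter names are illustrative, not from the StockTransformer repository.

```python
import numpy as np

def sliding_windows(series: np.ndarray, lookback: int, horizon: int):
    """Split a 1-D series into (input, target) pairs where each target
    window starts strictly after its input window ends, so no future
    values leak into the model's features."""
    X, y = [], []
    for t in range(len(series) - lookback - horizon + 1):
        X.append(series[t : t + lookback])
        y.append(series[t + lookback : t + lookback + horizon])
    return np.stack(X), np.stack(y)

def chronological_split(X, y, test_frac: float = 0.2):
    """Hold out the most recent windows for testing; shuffling across
    the boundary would leak future information into training."""
    cut = int(len(X) * (1 - test_frac))
    return (X[:cut], y[:cut]), (X[cut:], y[cut:])
```

The chronological hold-out is the part that matters for fair ARIMA/LSTM comparisons: a random split would let the model train on windows drawn from after the evaluation period.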