Preet Baxi
Work Experience
AI/ML - Applications of LLMs
October 2025 - Present
May 2025 - Present
January 2025 - Present
ConducLng an experimental study on prompt-injection behavior in LLM-based résumé evaluation, examining how manipulative instructions affect model reasoning and ranking in single- and multi-prompt settings.
Built an end-to-end automated trading system that ingests real-time financial headlines, applies NLP-based sen&ment scoring, and runs predictive models to forecast short-term market movements. Integrated API-linked brokers for autonomous order execution. Back tests demonstrated a 20 % improvement in Sharpe ratio and consistent alpha generation across multiple market regimes.
Developed a universal AI-assisted heuristic-generation framework for stochastic inventory optimization, enabling language-model agents to autonomously design and refine decision policies under demand uncertainty, fully adaptable to parameter changes without code modification.
Algorithm Development and Data Science
July 2022 - December 2024
January 2024 - July 2024
May 2019 - June 2022
Devised a unique “Grouped semi-coherent search algorithm” to detect continuous GW signals from high spindown neutron stars using Python, and FFT-based data segmentation techniques, achieving improved (50 %) detection statistic by balancing computational cost through template spacing and parameter optimization.
Architected scalable algorithms using Python, C, machine learning techniques, and parallel computing for data generation, signal detection, and hardware monitoring, achieving 95 % accuracy in pulsar filtering and enhancing detector calibration using data structures, SFT and F Statistic methods, and matched filtering.
Engineered machine learning-based search and diagnostic algorithms using scikit-learn and TensorFlow to analyze over 200+ electronic channels, employing advanced anomaly detection models to isolate and resolve 4 previously undetected noise sources, improving overall system performance by 45 % and reducing error rates.
Designed and validated predictive models simulating analog voltage propagation by optimizing filter configurations and data sampling strategies, resulting in a 60 % improvement in signal processing accuracy and robust noise mitigation across complex datasets to characterize a new Anti-Aliasing chassis instrument.
Formulated and optimized parameter estimation algorithms using Cramer Rao bounds and advanced Monte Carlo techniques to achieve a 97 % accuracy rate in extracting source properties across various detector systems
Accelerated data analysis and visualization speeds by 30 % by designing custom algorithms based on Bayesian Statistical Methods, leading to faster and more reliable research outcomes.