Profile
I am a recent graduate based in Copenhagen with dual BSc degrees in Mathematics and Computer Science and an MSc in Quantum Information Science. My strengths sit at the intersection of quantitative thinking and engineering execution: statistical modeling (including time series), clear reasoning about uncertainty, and shipping working software (APIs, authentication, deployment).
I am looking for junior roles where I can contribute as a Data Scientist / Quantitative Analyst (non-trading) or in data-heavy engineering, and where learning fast is part of the job.
Experience
- Built a greenfield internal knowledge platform for researchers (2-person team, minimal guidance, ~12h/week).
- Implemented ~80% of the codebase across frontend and backend; focused on reliability and clarity over cleverness.
- Designed authentication flows using JWT access + refresh tokens with secure HTTP-only cookies.
- Developed async APIs with FastAPI and managed relational data (~15 tables) using SQLite.
- Delivered a working prototype deployed on an internal machine for ~50 internal users.
- Participated in an internal applied AI program focused on end-to-end delivery rather than isolated modeling.
- Trained a computer vision classification model in Azure ML and deployed it to edge devices via Azure agents.
- Learned first-hand what “deployment constraints” actually mean (foreign machines, agents, and operational friction).
- Studied how noise impacts Gibbs states prepared by shallow local quantum circuits (k-local, depth O(log n)).
- Implemented tensor-network simulations in Python using Quimb (systems up to 8 qubits; research-grade experimentation).
- Focused on depolarizing noise (with discussion of amplitude damping) and formulated an original conjecture.
Projects
- Implemented classical time series models (AR, MA, ARIMA, ARIMAX) and explored stepwise model selection.
- Ran diagnostics including residual checks and heteroskedasticity tests to understand when assumptions break.
- Regression: built linear models with L1/L2 regularization and tuned hyperparameters via grid search.
- Unsupervised learning: applied Gaussian Mixture Models to physiological signals (e.g., heart rate, blood pressure) to explore relationships with emotional states.
- Run self-hosted services on a Raspberry Pi using Linux, Docker, and systemd.
- Configured an Nginx reverse proxy with HTTPS via Certbot, and use Tailscale for secure remote access.
- Debugged real operational issues (e.g., DNS failures when Pi-hole went offline due to missing fallback DNS).
- Deployed a Django site with Gunicorn + Nginx, TLS, static file handling, and sane restart behavior.
- Set up GitHub Actions to keep deployments repeatable and reduce “it works on my machine” drift.
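The time series workflow above, in miniature: fit an AR(1) coefficient by least squares and check that the residuals are roughly uncorrelated. This is a toy NumPy illustration on simulated data, not the project code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process: x_t = phi * x_{t-1} + eps_t
n, phi = 500, 0.6
x = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Least-squares estimate of phi: regress x_t on x_{t-1}
X, y = x[:-1], x[1:]
phi_hat = (X @ y) / (X @ X)

# Residual diagnostic: lag-1 autocorrelation of residuals should be near zero
resid = y - phi_hat * X
r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
```

If `r1` were far from zero, the model would be missing structure, which is exactly the kind of assumption-breaking the diagnostics above are meant to catch.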
Education
- Relevant focus: time series analysis, statistical modeling, computational methods, and research-grade simulation.
- Thesis: learning local Gibbs states from noisy shallow quantum circuits (tensor networks, Quimb).
- Coursework in probability, statistical inference, linear algebra, algorithms, and stochastic processes.
- BSc thesis: fundamentals of quantum computing; small algorithm simulations in Qiskit.
Skills
Programming
Python (NumPy, Pandas, scikit-learn, Quimb, Qiskit); C++ (STL, algorithmic practice); JavaScript (Vue); SQL basics (joins, GROUP BY) plus ORMs.
Data & Modeling
Time series (AR/ARIMA/ARIMAX), regression (L1/L2), Gaussian Mixture Models, hypothesis testing, Monte Carlo (European option pricing coursework).
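The Monte Carlo item refers to European option pricing coursework. A self-contained sketch prices a call under risk-neutral geometric Brownian motion and checks it against the Black–Scholes closed form; the parameters are illustrative, not from the coursework.

```python
import math
import numpy as np


def bs_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes call price, used as a reference value."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)


def mc_call(S0, K, r, sigma, T, n_paths=200_000, seed=0):
    """Monte Carlo estimate: simulate terminal prices, discount the mean payoff."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    payoff = np.maximum(ST - K, 0.0)
    return math.exp(-r * T) * payoff.mean()
```

With 200k paths the Monte Carlo estimate lands within a few cents of the analytic price, which is the standard sanity check for this kind of simulation.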
Systems
Linux, Git/GitHub, GitHub Actions, Docker, Nginx, systemd, TLS/Certbot, reverse proxying, basic operations/debugging.
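A representative slice of that stack: an Nginx server block that terminates TLS and proxies to a Gunicorn upstream. Domain, certificate paths, and port are hypothetical placeholders, not the actual deployment.

```nginx
# Minimal reverse-proxy sketch: TLS termination in front of Gunicorn
server {
    listen 443 ssl;
    server_name example.internal;

    ssl_certificate     /etc/letsencrypt/live/example.internal/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.internal/privkey.pem;

    location /static/ {
        alias /srv/app/static/;   # serve collected static files directly
    }

    location / {
        proxy_pass http://127.0.0.1:8000;          # Gunicorn upstream
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```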
How I Work
I prioritize correctness and maintainability, write things down, and try to make failures loud rather than silent. I'm comfortable learning in ambiguous environments.