I'm a member of the transformers
team at Hugging Face 🤗, expanding what's possible with text generation (PT/TF/JAX). I have a PhD in ML applied to 5G signal processing 📡, and I've worked with ML across several modalities (text, image, graphs, time-based signals) and industries (telecom, construction, software).
Here are a few interesting open projects and publications I've been part of, grouped by industry:
🤖 Software
- Hugging Face 🤗
1.1. 100x faster TensorFlow text generation with XLA (blog post, Twitter thread, TensorFlow blog) -- sketched after this list;
1.2. PT ➡️ TF safe weight conversion CLI (Twitter thread);
1.3. Assisted Generation -- faster text generation with the aid of a smaller model (blog post, Twitter thread), also sketched after this list;
1.4. ...and many others. A repo with personal notebooks can be found here.
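The XLA speedup in 1.1 comes from compiling `generate` with `tf.function(..., jit_compile=True)` and padding inputs to fixed shapes so recompilation is avoided across calls. A minimal sketch, assuming a recent transformers version with TF support (the checkpoint and lengths are just placeholders):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
tokenizer.pad_token = tokenizer.eos_token
model = TFAutoModelForCausalLM.from_pretrained("gpt2")

# Compile generate with XLA; the first call traces and compiles, later calls reuse it.
xla_generate = tf.function(model.generate, jit_compile=True)

# Padding to a fixed length keeps input shapes constant, avoiding retracing.
inputs = tokenizer(["TensorFlow text generation is"], return_tensors="tf",
                   padding="max_length", max_length=32)
outputs = xla_generate(**inputs, max_new_tokens=24)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```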
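Assisted generation (1.3) pairs the main model with a much smaller assistant that drafts candidate tokens, which the larger model then checks in a single forward pass. A minimal sketch of the `generate` API -- the OPT checkpoints below are only an example of a large/small pair sharing a tokenizer:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Any large model plus a smaller assistant with the same tokenizer works; OPT is illustrative.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-1.3b")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")
assistant = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

inputs = tokenizer("The future of text generation is", return_tensors="pt")
# Passing assistant_model enables assisted generation inside generate().
outputs = model.generate(**inputs, assistant_model=assistant, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```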
🏗 Construction
- nPlan
1.1. Forecasting delays in the activities of construction projects. Modeling the task as classification makes it possible to learn an arbitrary delay distribution for each input (paper);
1.2. Exploring aleatoric vs. epistemic uncertainty with Monte Carlo Dropout and ensembles (some code; sketched below), and estimating their impact on forecasts.
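As a quick illustration of the Monte Carlo Dropout idea in 1.2 (a generic PyTorch sketch, not the nPlan codebase): dropout is kept active at inference time, several stochastic forward passes are averaged, and their spread is used as an uncertainty estimate.

```python
import torch
import torch.nn as nn

# Toy regression network with dropout between layers (architecture is illustrative only).
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(), nn.Dropout(0.1),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(0.1),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=50):
    model.train()                       # keep dropout layers sampling at inference time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(0), preds.std(0)  # predictive mean and its spread (uncertainty proxy)

mean, std = mc_dropout_predict(model, torch.randn(8, 16))
```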
📡 Telecommunications
- Square Kilometre Array
1.1. Accelerating FIR filters on FPGAs using OpenCL (paper, code);
- Positioning (PhD)
2.1. Designing new signals, collected from 5G communications, that contain spatial information about the surroundings (paper);
2.2. ML models (DNNs, CNNs, LSTMs, and TCNs) can then convert the signal from 2.1 into the receiver's position, matching the accuracy of GPS while being much more energy efficient (paper, code);
2.3. Using Monte Carlo Dropout to estimate the uncertainty of the position predictions from 2.2 (paper, code).
Feel free to reach out using the contacts on this profile.