Senior Research Engineer/Scientist - On-Device Transformer Models
About the Team
The Future of Computing Research team is an Applied Research team within the Consumer Devices group, focused on developing new methods and models that support our vision as we advance our mission of building AGI that benefits all of humanity.
About the Role
As a Research Engineer/Scientist on the Future of Computing Research team, you will take on the day-to-day expectations outlined below.
What this role actually needs.
Senior Research Engineer/Scientist - On-Device Transformer Models at OpenAI in San Francisco. UpJobz keeps this listing high-signal for applicants targeting serious high-tech roles across the United States, Canada, and Mexico.
Day-to-day expectations
A clear list of the work this role is designed to cover.
- Train and evaluate multimodal SoTA models along axes that are important to our vision for future devices.
- Develop novel architectures that improve model performance when scaling the models themselves is not an option.
- Run through the necessary walls to take nascent research capabilities and turn them into capabilities we can build on top of.
What a strong candidate brings
This keeps the job page specific, readable, and easier to match.
- A research background related to developing on-device transformer models.
- A love of performance optimization and of working with GPU kernel engineers (you do not need CUDA experience yourself).
- Rigorous science rather than vibes-based experimentation; we need confidence in the experiments we run to move quickly.
Turn this listing into an application plan.
This is the first pass at the premium UpJobz layer: a fast brief that helps serious applicants move with more clarity.
Next moves
- Tailor your resume around AI and machine learning instead of sending a generic application.
- Use the first two bullets of your application to connect your background directly to the role: Senior Research Engineer/Scientist - On-Device Transformer Models is a high-signal hybrid role in San Francisco, and it is most realistic for United States residents.
- Open the role quickly if it fits and bookmark three similar jobs before you leave the page.
Watchouts
- The posted range, $380K - $445K, is visible, so calibrate your application around it.
- Make your United States residency explicit in your positioning so the recruiter does not have to infer it.
- Show concrete examples of succeeding in hybrid environments.
Ready to move on this role?
This page keeps the application flow simple while giving you enough context to decide quickly and move.