Data Engineer, Data Architecture and Engineering
The gSO Data, Architecture, Tools and Analytics (gData) team empowers Google to make brilliant business decisions by delivering critical data infrastructure and actionable insights. Supporting the global gTech Ads organization, gData manages massive datasets to solve complex, non-routine investigative issues. Ultimately, our insights optimize operations and enhance the advertiser experience that drives the majority of Alphabet's business.
Google creates products and services that make the world a better place, and gTech’s role is to help bring them to life. Our teams of trusted advisors support customers globally. Our solutions are rooted in our technical skill, product expertise, and a thorough understanding of our customers’ complex needs. Whether the answer is a bespoke solution to solve a unique problem, or a new tool that can scale across Google, everything we do aims to ensure our customers benefit from the full potential of Google products.
To learn more about gTech, check out our video.
Minimum qualifications:
- Bachelor's degree in Computer Science, Information Systems, a related technical field, or equivalent practical experience.
- 3 years of experience in data engineering, database querying (e.g., SQL), developing data pipelines (ETL), and building data visualizations (dashboards/reports).
- 3 years of experience coding in one or more programming languages.
- 3 years of experience working with data infrastructure and data models, including writing exploratory queries and scripts.
- 3 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external stacks (e.g., Dataflow, Spark).
Preferred qualifications:
- Master's degree in Computer Science, Engineering, Mathematics, a related technical field, or equivalent practical experience.
- 3 years of experience partnering with and managing stakeholders (e.g., users, partners, customers).
- 3 years of experience developing project plans and delivering projects on time, within budget, and in scope.
- Experience using AI technologies to augment, improve or automate the development process.
- Experience writing and maintaining ETL pipelines that operate on a variety of structured and unstructured sources.
- Experience in large-scale distributed data processing, including familiarity with NoSQL databases.