Job Details
Posted date: Jan 09, 2026
Location: Seattle, WA
Level: Senior
Estimated salary: $163,000
Range: $132,000 - $194,000
Description
Lead projects and cross-functional initiatives within Google, interacting with executive stakeholders from Engineering, Legal, Product, and other teams. Be the go-to person for issues related to this area of the business and use your domain knowledge to provide partners with insights and analyses. Work across user-centric issues in search results to prevent user harm, financial fraud, and identity theft, including non-consensual explicit images, exploitative removal practices, and doxxing. Deliver leadership and impact for the broader Trust and Safety Search, Assistant, and Geo team. Perform on-call responsibilities on a rotating basis, including weekend coverage. This role works with sensitive content or situations and may be exposed to graphic, controversial, or upsetting topics or content.
Trust & Safety team members are tasked with identifying and taking on the biggest problems that challenge the safety and integrity of our products. They use technical know-how, excellent problem-solving skills, user insights, and proactive communication to protect users and our partners from abuse across Google products like Search, Maps, Gmail, and Google Ads. On this team, you're a big-picture thinker and strategic team player with a passion for doing what's right. You work globally and cross-functionally with Google engineers and product managers to identify and fight abuse and fraud cases at Google speed - with urgency. And you take pride in knowing that every day you are working hard to promote trust in Google and ensure the highest levels of user safety.
Trust and Safety is Google’s team of user trust experts working to make the internet a safer place. The Search Intelligence Trust and Safety team is a team of analysts focused on the safety of Search ranking results and Search features, working closely with various engineering teams across the Search product area. Our mission is to identify, measure, and mitigate user trust risks through automated solutions.
In this role, you will help with the successful launch of High Risk features, acting as a strategic partner to the Product Area launch team, representing the launch to leadership, managing communications, and providing oversight on adequate enforcement and feedback-loop processes. You will enable the deployment of defenses to stop abuse, and lead process improvement efforts to improve the speed and quality of response. You will identify platform needs and influence enforcement capability design by working with engineering and product partners.
At Google we work hard to earn our users’ trust every day. Trust & Safety is Google’s team of abuse-fighting and user trust experts working daily to make the internet a safer place. We partner with teams across Google to deliver bold solutions in abuse areas such as malware, spam, and account hijacking. A team of Analysts, Policy Specialists, Engineers, and Program Managers, we work to reduce risk and fight abuse across all of Google’s products, protecting our users, advertisers, and publishers across the globe in over 40 languages.
The US base salary range for this full-time position is $132,000-$194,000 + bonus + equity + benefits. Our salary ranges are determined by role, level, and location. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Your recruiter can share more about the specific salary range for your preferred location during the hiring process.
Please note that the compensation details listed in US role postings reflect the base salary only, and do not include bonus, equity, or benefits. Learn more about benefits at Google.
Qualifications
Minimum qualifications: Bachelor's degree or equivalent practical experience. 7 years of experience in data analytics, trust and safety, policy, cybersecurity, or related fields.
Preferred qualifications: Master's degree or PhD in a relevant field. Experience analyzing ML model performance, or working on LLM prompting, training, or developing LLMs. Experience in SQL, building dashboards, data collection/transformation, visualization, or experience in a scripting/programming language (e.g., Python). Experience with machine learning. Excellent problem-solving and critical thinking skills with attention to detail in an ever-changing environment.