Position Details: Sr Data Engineer - 1250409N

Location: Beaverton, OR
Openings: 2

Description:

If you’re ready to innovate and become part of our Enterprise Data organization, come join us now! You will be part of an organization that is revolutionizing Client technology platforms and architecting a data and analytics landscape that is simplified, modern, and flexible.

As a Data Engineer within the North America Commercial Analytics team, you will be a key member of a growing and passionate group focused on collaborating across business and technology resources to drive forward key products, programs, and projects that build enterprise data & analytics capabilities across Client.

Primary Responsibilities:
• Apply a strong understanding of modern data processing technology stacks (e.g., Snowflake, SQL, Spark, and Airflow)
• Design and build product features in collaboration with business and IT 
• Design reusable components, frameworks, and code
• Use and contribute to continuous integration pipelines
• Performance/scalability tuning, algorithms and computational complexity
• Develop architecture and design patterns to process and store high volume data sets
• Participate in an Agile/Scrum methodology to deliver high-quality software releases every two weeks through sprints
• Troubleshoot production support issues post-deployment and develop solutions as required

Qualifications:
• 2+ years’ experience in a professional organization collaborating across multiple functions
• Familiarity with Agile project delivery methods
• Experience with AWS components and services (e.g., EMR, S3, and Lambda)
• Experience with Jenkins, Bitbucket/GitHub and scheduling tools like Airflow
• Strong programming skills in Python, shell scripting, and SQL
• Good understanding of file formats including JSON, Parquet, Avro, and others
• Experience with data warehouses/RDBMS like Snowflake, Teradata
• Experience with data warehousing, dimensional modeling and ETL development
• Demonstrable ability to quickly learn new tools and technologies 
• Machine learning frameworks & statistical analysis with Python, R or similar
• Exceptional interpersonal and communication skills (written and verbal)
• Passion for data with demonstrated ability to use data to tell a story and influence decision making
• Detail oriented with strong information seeking skills
Competencies:
• Effective Communicator
• Broad Business Process and Systems Understanding
• Analytical and Focused
• Strategic Thinking and Tactical Execution
• Continuous Learner

Education:
• A Bachelor's degree in Computer Science, Engineering, Business, Information Technology, or a related field


Powered by: CATS - Applicant Tracking System