iPipeline Inc

Data Analytics Engineer

Location: US-PA-Exton
ID: 2024-1148
Category: Research/Development
Position Type: Full Time
Additional Job Location: iPipeline - North Carolina
Additional Job Location: iPipeline - Florida

Overview

As a global market leader, iPipeline combines technology, innovation, and expertise to deliver ground-breaking, award-winning software solutions that transform the life insurance, financial services, and protection industries. With one of the industry’s largest data sets, we help advisors/advisers and agents to transform paper and manual operations into a secure, seamless digital experience – from proposal to commission – so they can help better secure the financial futures of their clients.

 

At iPipeline, you’ll play a major role in helping us to provide best-in-class, transformative solutions. We’re passionate, creative, and innovative, and together as a team, we continually strive to advance, accelerate, and expand the reach of our technology. We value different perspectives and are committed to creating an environment that embraces diverse backgrounds and fosters inclusion.

 

We’re proud that we’ve been recognized as a repeat winner of various industry awards, demonstrating our excellence and highlighting us as a top workplace in both the US and the UK. We believe that the culture we’ve built for our nearly 900 employees around the world is exceptional, and we’ve created a place where our employees love to come to work, every single day.

 

Come join our team!

 

About iPipeline

Founded in 1995, iPipeline operates as a business unit of Roper Technologies (Nasdaq: ROP), a constituent of the Nasdaq 100, S&P 500®, and Fortune 1000® indices. iPipeline is a leading global provider of comprehensive and integrated digital solutions for the life insurance and financial services industries in North America, and the life insurance and pensions industries in the UK. We couple one of the most expansive digital and automated platforms with one of the industry’s largest data libraries to accelerate, automate, and simplify various applications, processes, and workflows – from quote to commission – with seamless integration. Our vision is to help everyone achieve lasting financial security by delivering innovative solutions that connect, simplify, and transform the industry.

 

iPipeline is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to gender, race, color, religious creed, national origin, age, sexual orientation, gender identity, physical or mental disability, and/or protected veteran status. We are committed to building a supportive and inclusive environment for all employees.

 

This is an office-based position.

Responsibilities

We are currently looking for a Data Analytics Engineer to join our team.

iPipeline’s Data Analytics team works with exciting, cutting-edge technologies like Snowflake, AWS, and Sigma, and is seeking a Data Analytics Engineer to help grow our data products. The ideal candidate will have experience developing in SQL and Python and creating data visualizations and dashboards based on customer requirements.

  • Interact with customers and internal users to gather requirements, engineer data, create visualizations, and iterate on analytic designs.
  • Build curated views with SQL in Snowflake to feed downstream analytics and data science applications.
  • Design powerful and flexible data models in SQL and Power BI to best serve customer needs.
  • Build, deploy, and maintain predictive machine learning models using Python + AWS.
  • Organize the collection, processing, and storage of data and downstream business models.
  • Enable data observability and resource governance in Snowflake.
  • Validate data accuracy against the Snowflake database and with product SMEs.
  • Build, update, and maintain end-to-end data pipelines with a variety of tools and services, from ingestion to data lake and/or data warehouse.

Qualifications

  • Strong SQL skills and a deep understanding of data modeling and ETL/ELT processes.
  • Experience building star schema models and applying dimensional modeling principles.
  • Experience creating and maintaining data pipelines from a data lake to Snowflake.
  • 3+ years’ experience working in a BI tool, preferably Power BI.
  • Experience developing machine learning models in Python.
  • Ability to quickly pick up new technologies and adapt to a fast-paced, innovative environment.
  • Experience with AWS resources and services.

Desired Qualifications

  • Experience with Snowflake and dbt.
  • Experience with AWS Database Migration Service, Kinesis, RDS, S3, DynamoDB, Lambda, and QuickSight.
  • Experience with Sigma.
  • Experience with the Tabular Object Model (TOM) and Power BI external tools such as Tabular Editor and ALM Toolkit.
  • Experience working in an agile environment and with Jira.
  • Experience with infrastructure as code, preferably Terraform.
