Position Description:
Delivers a state-of-the-art analytics platform using SQL, Python, Java, and Extract, Load, Transform (ELT) development. Implements relational and dimensional data models to improve platforms. Maintains data using database technologies (Snowflake). Provides long-term solutions for foundational data and analytics platforms. Coordinates workflows using Continuous Integration and Continuous Delivery (CI/CD) methodologies and software version control. Provides business solutions by developing complex software applications.
Primary Responsibilities:
Develops original and creative technical solutions to ongoing development efforts.
Designs applications or subsystems on major projects and across multiple platforms.
Develops applications for multiple projects supporting several divisional initiatives.
Supports and performs all phases of testing leading to implementation.
Assists in the planning and conducting of user acceptance testing.
Develops comprehensive documentation for multiple applications supporting several corporate initiatives.
Responsible for post-installation testing for any problems.
Establishes project plans for projects of moderate scope.
Works on complex assignments, often spanning multiple phases of a project.
Performs independent and complex technical and functional analysis for multiple projects supporting several initiatives.
Education and Experience:
Bachelor’s degree in Applied Computer Science, Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and three (3) years of experience as a Senior Data Engineer (or closely related occupation) developing data warehouse and reporting platforms using SQL and AWS in a financial services environment.
Alternatively, Master’s degree in Applied Computer Science, Computer Science, Engineering, Information Technology, Information Systems, or a closely related field (or foreign education equivalent) and one (1) year of experience as a Senior Data Engineer (or closely related occupation) developing data warehouse and reporting platforms using SQL and AWS in a financial services environment.
Skills and Knowledge:
Candidate must also possess:
Demonstrated Expertise (“DE”) building data warehouses and developing Big Data Hadoop solutions using Snowflake, Informatica, Airflow, Control-M, UNIX, Spark SQL, and Python.
DE analyzing, profiling, mining, extracting, and cleansing data and large-scale data warehouses using Snowflake, Postgres, Netezza, AWS, SQL Server, and Oracle.
DE generating visual insights for business and end users by creating data models and data structures using Oracle Business Intelligence Enterprise Edition (OBIEE), BI Publisher, and Tableau.
DE performing data analysis, including data cleaning, preparation, transformation, and modeling of data from various sources onto a data warehouse platform, to support business needs and decision making, using Snowflake, Oracle, SQL Server, and IBM DB2.
Category:
Information Technology

Most roles at Fidelity are Hybrid, requiring associates to work onsite every other week (all business days, M-F) in a Fidelity office. This does not apply to Remote or fully Onsite roles.
Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment and retirement-related financial activities and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain Criminal Histories.