Job Description:
Designs and architects complex software solutions on a platform dedicated to high-scale processing of large amounts of data using Java, J2EE, Spring MVC, Spring Core, JavaScript, relational databases, and NoSQL database management systems. Creates and improves upon Application Programming Interface (API) design components, including versioning, isolation, and micro-services. Employs API documentation frameworks -- Swagger. Develops robust web applications using technical tools -- JSP, HTML, CSS/SASS, jQuery, Angular, and Node.js. Builds automation pipelines to enhance application efficiency, performance, and agility using DevOps processes and Continuous Integration and Continuous Delivery (CI/CD) tools -- Maven, Jenkins, Stash, Ansible, and Docker.
Primary Responsibilities:
- Deploys open-source technical streaming products via micro-services, Message-Oriented Middleware, Stream Processing, and Master Data Management.
- Enhances public and private cloud capabilities and components, such as compute, storage, database, and analytics, within cloud environments -- Amazon Web Services (AWS) and Azure.
- Develops technical prototypes and orchestrates software solutions via iterative approaches using testing frameworks -- JUnit, Mockito, and Spring Test.
- Manages complex software projects and initiatives using Agile Software Development Lifecycle (SDLC) methodologies (SCRUM).
- Designs, develops, and modifies complex and major software systems; uses scientific analysis and mathematical models to predict and measure the outcomes and consequences of design decisions.
- Develops and directs software system testing and validation procedures, programming, and documentation.
Education and Experience:
Bachelor’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Information Management, Business Administration, or a closely related field and six (6) years of experience as a Director, Full Stack Engineering (or closely related occupation) executing data modeling, developing Application Programming Interfaces (APIs), and building data capabilities to scale within a financial services environment using distributed data technologies.
Or, alternatively, Master’s degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Information Management, Business Administration, or a closely related field and four (4) years of experience as a Director, Full Stack Engineering (or closely related occupation) executing data modeling, developing Application Programming Interfaces (APIs), and building data capabilities to scale within a financial services environment using distributed data technologies.
Skills and Knowledge:
Candidate must also possess:
- Demonstrated Expertise (“DE”) developing highly available, low-latency, distributed database applications using Oracle, PostgreSQL, and DynamoDB; executing performance tuning and building relational and non-relational data models (conceptual, logical, and physical) for operational and analytic data stores using Entity Relationship, Dimensional, and Data Vault modeling techniques.
- DE building API contracts for REST and GraphQL APIs; developing applications using micro-services architecture, using Java/J2EE, Dropwizard/Spring framework, and Apache Tomcat server; performing automated testing of customer-facing applications using Postman collections; deploying REST APIs as Docker containers to Kubernetes-based cloud environments in AWS; and configuring continuous build and automated deployment using Git, Jenkins, Maven, and Artifactory.
- DE architecting cloud-native Storage, Compute, and Streaming solutions using S3, RDS, EC2, Lambda, Glue, AWS Kafka, Kinesis, SQS, and SNS; developing data streaming processes to enable real-time data integration between internal and external systems, using open-source systems -- Kafka and Kafka Connect; and performing application log collection and monitoring using the Elasticsearch, Logstash, and Kibana (ELK) stack.
- DE using Snowflake implementations and design patterns; building ELT/ETL pipelines to move data to and from Snowflake; and streaming operational data in real time to an Enterprise Data Lake (Snowflake) for analytics and reporting.
Certifications:
Category:
Information Technology

Fidelity’s hybrid working model blends the best of both onsite and offsite work experiences. Working onsite is important for our business strategy and our culture. We also value the benefits that working offsite offers associates. Most hybrid roles require associates to work onsite every other week (all business days, M-F) in a Fidelity office.
Please be advised that Fidelity’s business is governed by the provisions of the Securities Exchange Act of 1934, the Investment Advisers Act of 1940, the Investment Company Act of 1940, ERISA, numerous state laws governing securities, investment, and retirement-related financial activities, and the rules and regulations of numerous self-regulatory organizations, including FINRA, among others. Those laws and regulations may restrict Fidelity from hiring and/or associating with individuals with certain criminal histories.