Data Engineer, Client Experience Platform - Associate

About this role

When BlackRock was started in 1988, its founders envisioned a company that combined the best of financial services with cutting-edge technology. They imagined a business that would provide financial services to clients as well as technology services to other financial firms. The result of their vision is Aladdin, our industry-leading, end-to-end investment management platform. With assets valued at over USD 10 trillion managed on Aladdin, our technology empowers millions of investors to save for retirement, pay for college, buy a home and improve their financial well-being.

Data is at the heart of Aladdin, and increasingly the ability to consume, store, analyze and gain insight from data has become a key component of our competitive advantage.

The Client Experience Platform team is focused on developing integrated experiences across desktop, web, and mobile channels for clients, regulators, engineers, sales, and service professionals. The mission of this team is to create a best-in-class experience across the horizontal client journey.

We are looking for talented Software Engineers who will architect and build the Client Data Platform on the cloud, which will manage and house a 360-degree view of our CRM, Sales and Service ecosystem. The candidate will focus on building scalable data ingestion pipelines and on conforming and transforming the data to support analytical use cases with data of the highest quality for all users of the platform, notably the Global Client Business, US Wealth Advisory, executive teams and data scientists.

Software engineers at BlackRock get to experience working at one of the most recognized financial companies in the world while being part of a software development team responsible for next-generation technologies and solutions. Our engineers design and build large-scale data storage, computation and distribution systems.

Description:

As a Senior Data Engineer, you will…

  • Improve BlackRock’s ability to enhance our retail sales distribution capabilities and services suite by creating, expanding and optimizing our data and data pipeline architecture.
  • Create and operationalize data pipelines that enable squads to deliver high-quality, data-driven products.
  • Be accountable for managing high-quality datasets exposed for internal and external consumption by downstream users and applications.
  • Bring top technical/programming skills – Python, Java or Scala, the Hadoop suite, cloud data platforms (preferably Snowflake) and SQL. Experience ingesting and transforming data from flat files (e.g., CSV, TSV, Excel) and database/API sources is a must; a minimal ingestion sketch follows this list.
  • Given the highly execution-focused nature of the work, the ideal candidate will roll up their sleeves to ensure that their projects meet deadlines and will always look for ways to optimize processes in future cycles.
  • The successful candidate will be highly motivated to create, optimize, or redesign data pipelines to support our next generation of products and data initiatives.
  • Be a builder and an owner of your work product.
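The flat-file ingestion and conforming work described above could look roughly like the following PySpark sketch. The application name, file path, column-name normalization rule and output location are illustrative assumptions, not actual BlackRock systems or schemas.

```python
# Hedged sketch: ingest a raw CSV extract and conform it for downstream use.
# All paths and names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("client_data_ingest").getOrCreate()

# Ingest a raw flat-file extract (e.g., a CRM export) with headers and inferred types.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/landing/crm/accounts.csv")
)

# Conform the data: standardize column names and drop exact duplicates.
conformed = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .dropDuplicates()
)

# Persist the conformed dataset as Parquet for analytical consumers.
conformed.write.mode("overwrite").parquet("/conformed/crm/accounts")
```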

Responsibilities:

  • Lead the creation and maintenance of optimized data pipeline architectures on large and complex data sets.
  • Assemble large, complex data sets that meet business requirements.
  • Act as a lead to identify, design, and implement internal process improvements and relay them to the relevant technology organization.
  • Work with stakeholders to assist in data-related technical issues and support their data infrastructure needs.
  • Automate manual ingest processes and optimize data delivery subject to service level agreements; work with infrastructure teams on re-designs for greater scalability (a scheduling sketch follows this list).
  • Keep data separated and segregated according to relevant data policies.
  • Work with data scientists to develop data-ready tools that support their work.
  • Stay up to date with the latest trends in the big data space and recommend new tools and approaches as needed.
  • Identify, investigate, and resolve data discrepancies by finding the root cause of issues; work with partners across various cross-functional teams to prevent future occurrences.
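As an illustration of the ingest-automation responsibility above, a scheduled pipeline could be expressed as an Airflow DAG along the following lines (assuming Airflow 2.4+; the DAG id, schedule and ingest callable are hypothetical).

```python
# Hedged sketch: automate a previously manual ingest on a daily schedule.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_daily_extract(**context):
    # Placeholder for the real ingest logic, e.g. pulling a file from a
    # landing zone and loading it into the conformed layer.
    print(f"Ingesting extract for {context['ds']}")


with DAG(
    dag_id="client_data_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily run sized to a notional delivery SLA
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    PythonOperator(
        task_id="ingest_daily_extract",
        python_callable=ingest_daily_extract,
    )
```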

Qualifications:

  • Overall 6+ years of hands-on experience in computer/software engineering, with the majority in big data engineering.
  • 5+ years of strong Python or Scala programming skills (core Python and PySpark), including hands-on experience creating and supporting UDFs and testing with modules like pytest (a short sketch follows this list).
  • 5+ years of experience building and optimizing ‘big data’ pipelines, architectures, and data sets. Familiarity with data pipeline and workflow management tools (e.g., Airflow, dbt, Kafka).
  • 5+ years of hands-on experience developing on Spark in a production environment. Expertise in parallel execution, resource sizing, and the different modes of executing jobs is required.
  • 5+ years of experience using Hive (on Spark), YARN (logs, DAG flow diagrams) and Sqoop. Proficiency in bucketing, partitioning, tuning, and handling different file formats (ORC, Parquet and Avro).
  • 5+ years of experience using Transact-SQL (e.g., MS SQL Server, MySQL), NoSQL, and GraphQL.
  • Strong experience implementing solutions on Snowflake.
  • Experience with deployment, maintenance, and administration tasks related to cloud platforms (Azure preferred), OpenStack, Docker, Kafka, and Kubernetes.
  • Experience working with global teams across different time zones.
  • Plus – Experience with Machine Learning and Artificial Intelligence
  • Plus – Experience with Generative Artificial Intelligence
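As a concrete example of the UDF and pytest qualification above, a small PySpark UDF and its unit test might look like the following. The normalization rule and all names are illustrative assumptions.

```python
# Hedged sketch: a PySpark UDF backed by a plain Python function,
# plus a pytest unit test that exercises the function directly.
from pyspark.sql import functions as F
from pyspark.sql.types import StringType


def normalize_region(value):
    """Map free-text region labels to a small controlled vocabulary (illustrative rule)."""
    if value is None:
        return "UNKNOWN"
    return value.strip().upper().replace(" ", "_")


# Wrap the function as a Spark UDF so it can be applied to DataFrame columns.
normalize_region_udf = F.udf(normalize_region, StringType())


def test_normalize_region():
    # Testing the underlying Python function keeps the pytest suite fast,
    # with no Spark session required.
    assert normalize_region("  north america ") == "NORTH_AMERICA"
    assert normalize_region(None) == "UNKNOWN"
```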

Our benefits

To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model

BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock

At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.

This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

Skills
Data Engineer