Data Engineer - Developer, Data Access Delivery
Cambridge, MA
The Data Engineer will be part of a high-performing Data Access team in the Data & Analytics CoE. They will be responsible for delivering data solutions in support of Commercial and Medical teams at Biogen, and for architecting and building data platforms that facilitate end-user data access across a wide range of business functions.

The ideal candidate will have:
  • A minimum of 6 years of experience in building and architecting enterprise-class large data warehouses and ETL solutions, both on-premise and in the cloud
  • At least 3 years of hands-on experience with the AWS Redshift big data cloud platform (required)
  • Strong experience with Extract, Transform, Load (ETL) or ELT data ingestion and data pipelines, on both on-premise and cloud platforms (Informatica/Oracle and AWS/Redshift)
  • Strong experience with data modeling, design patterns, and building highly scalable big data solutions and distributed applications
  • Good experience with programming/scripting languages such as Python, shell scripting, or Scala (any combination)
  • Expertise with multiple AWS services and hands-on AWS experience, with a minimum of one to two referenceable implementations at enterprise scale
  • Experience in designing and configuring AWS services for data warehouse migration from on-premise to cloud, and prior experience with the nuances of moving data from RDBMS sources to a columnar database
  • Strong understanding of and experience with AWS IaaS and PaaS services such as Redshift, EC2, S3, Lambda, and RDS, and with the AWS CLI; hands-on experience with the core AWS platform and security architecture, including virtual private clouds, network design, and subnets
  • Good understanding of AWS security services such as Identity and Access Management (IAM) roles and policies, Key Management Service (KMS), and audit logging
  • Experience with data integration technologies such as Informatica Intelligent Cloud Services is a plus
  • Hands-on development experience with open-source big data components from the Hadoop ecosystem (Hive, Pig, Spark, Kafka, Sqoop, etc.) is a plus
  • A team player who collaborates with stakeholders within and outside the team, with the flexibility to adapt to team and business needs
  • Good communication skills, with the ability to understand business requirements and translate them into technical solutions
Other requirements:
  • Solid understanding of traditional data platforms and big data platforms
  • Ability to innovate and bring new ideas that speed up data access and create efficiencies in code and applications
  • Enhance team knowledge through your experience and innovation
  • Educate and share best practices on data management and data warehousing
  • Ability to devise out-of-the-box solutions and lead MSP resources in execution and delivery
  • Good communication skills – both verbal and written

  • Bachelor's or Master's degree
  • Minimum of 6 years of software development experience
Employment Category: Full-Time Regular
Experience Level: Mid-Senior Level