
Data Engineer (Software Design) Full-time Job

Posted 2 weeks ago · Human Resources · Valencia
Job details

Moffatt & Nichol is renowned for specializing in large, complex waterfront infrastructure projects and is acknowledged as a global leader in this sector. We are currently seeking a Data Engineer to join our team in Valencia or Algeciras.

The role involves supporting data infrastructure and pipelines within a large-scale cloud environment, enabling efficient data ingestion, transformation, and storage for analytical and operational needs. This position requires implementing robust ETL processes, optimizing data flows, and ensuring adherence to data engineering best practices and standards across microservices ecosystems.

About Moffatt & Nichol:

Moffatt & Nichol is a premier U.S.-based global infrastructure advisor specializing in the planning and design of facilities that shape and serve our coastlines, harbors, and rivers. In recognition of Moffatt & Nichol's dedication to design innovation since 1945, Engineering News-Record (ENR) has ranked the firm No. 1 in Marine and Port Facilities and among the Top 50 Designers in International Markets.

The company's professional staff comprises engineers, planners, scientists, and architects who cater to a global clientele from offices across Europe, the Americas, and the Pacific Rim. Moffatt & Nichol delivers customized services and a standard of excellence that has become the firm's signature across three primary practice areas—coastal, environmental, and water resources; ports and harbors; and transportation, bridges, and rail.

Duties and responsibilities:

Reasonable accommodations may be made to enable individuals with disabilities to perform these essential functions.

  • Design and implement big data architectures that support scalable ingestion, processing, and storage of large datasets using Databricks and Azure.
  • Design, develop, and maintain robust data pipelines to support real-time and batch processing of large volumes of data (an illustrative sketch follows this list).
  • Implement ETL processes to collect and transform data from various sources into usable formats.
  • Optimize ETL/ELT processes to move and transform data efficiently between cloud services, data lakes, and databases.
  • Optimize data workflows for performance, scalability, and cost-effectiveness.
  • Monitor and troubleshoot pipeline performance, identifying and resolving issues to ensure continuous data flow.
  • Work on database management, data storage, and data lake/warehouse solutions.
  • This is a hybrid role; applicants must reside in either Valencia or Algeciras, Spain.
  • Implement best practices for data governance, security, and compliance on cloud platforms.
  • Implement MLOps best practices and tools to track and productionize predictive models.
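
By way of illustration only, the kind of real-time ETL pipeline referenced in the list above might look like the minimal PySpark sketch below. Every name in it (Kafka broker, topic, schema fields, checkpoint path, target table) is a hypothetical placeholder rather than an actual Moffatt & Nichol system, and the Kafka connector is assumed to be available, as it is on Databricks runtimes.

    # Minimal, hypothetical sketch of a Kafka-to-Delta streaming ETL job.
    # All topics, paths, and table names are placeholders.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    # On Databricks, `spark` is already provided; this line keeps the sketch self-contained.
    spark = SparkSession.builder.appName("port-events-etl").getOrCreate()

    # Hypothetical schema of the incoming JSON events
    event_schema = StructType([
        StructField("vessel_id", StringType()),
        StructField("draft_m", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Ingest: read a real-time stream from a (placeholder) Kafka topic
    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "port-events")
        .load()
    )

    # Transform: parse the JSON payload and keep only well-formed records
    events = (
        raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
        .select("e.*")
        .filter(F.col("event_time").isNotNull())
    )

    # Store: land the cleaned stream in a Delta table for downstream analytics
    query = (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/datalake/checkpoints/port_events")
        .outputMode("append")
        .toTable("analytics.port_events")
    )

The same pattern covers batch loads by swapping readStream/writeStream for read/write, which is how Structured Streaming keeps real-time and batch code paths aligned.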

Other duties:

Please note this job posting is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee for this job. Duties, responsibilities and activities may change at any time with or without notice.

Qualifications:

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • 3-6 years of experience in data engineering, with a focus on cloud platforms, specifically Azure and Databricks.
  • Strong hands-on experience with Apache Spark, Kafka, Flink, and data lakes.
  • Experience with NoSQL databases (e.g., HBase) and the Hadoop ecosystem.
  • Experience building real-time data streaming pipelines using Kafka.
  • Solid understanding of ETL/ELT processes, data transformation, and model productization.
  • Experience productionizing machine learning models with MLflow (see the illustrative sketch after this list).
  • Strong knowledge of data security and governance practices in the cloud.
  • All new hires will be required to successfully complete and pass a pre-employment (post-offer) background check in compliance with NIST 800-171.
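
As a similarly hypothetical illustration of the MLflow-based productionization mentioned above, the sketch below trains a placeholder scikit-learn model, logs its parameters and metrics, and registers it under an invented name; it assumes access to an MLflow tracking server with a model registry (for example, a Databricks workspace).

    # Minimal, hypothetical MLflow tracking and registration sketch.
    # The model, data, and registry name are placeholders.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Synthetic data standing in for real features and targets
    X, y = make_regression(n_samples=500, n_features=8, noise=0.1, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    with mlflow.start_run(run_name="demand-forecast-demo"):
        model = RandomForestRegressor(n_estimators=100, random_state=0)
        model.fit(X_train, y_train)

        # Track the run's parameters and evaluation metric
        mlflow.log_param("n_estimators", 100)
        mlflow.log_metric("mae", mean_absolute_error(y_test, model.predict(X_test)))

        # Log the model and register it under a placeholder registry name
        mlflow.sklearn.log_model(
            model,
            artifact_path="model",
            registered_model_name="demand_forecast",
        )

Once registered, a model version can be promoted through the registry and loaded by inference jobs via a models:/demand_forecast/<version> URI, keeping training and serving decoupled.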

Working conditions:

Work is generally performed in an office or client office setting. Ability to use standard office equipment is required. Travel may be required occasionally.


Moffatt & Nichol’s EEO Statement:

As a global business, Moffatt & Nichol relies on diversity of culture and thought to deliver on our goal of Creative People, Practical Solutions® serving our client needs, and ensures nondiscrimination in all activities. We continuously seek talented, qualified employees in our world-wide operations regardless of race, color, sex/gender, including gender identity and expression, sexual orientation, pregnancy, national origin, religion, disability, age, marital status, citizen status, protected veteran status, or any other protected classification under country or local law. Moffatt & Nichol is proud to be an Equal Employment Opportunity/ Affirmative Action Employer/ Federal Contractor desiring priority referrals of all protected veterans for job openings.

If you need more information or require special assistance for persons with disabilities or limited English proficiency, please contact Human Resources at (562) 590-6500 or TTY/TDD users please call 711.