Irvine, CA

Data Engineer

Position Overview

We're looking for a seasoned Data Engineer to help our team build massively scalable consumer systems for clean energy. As a Data Engineer, you'll help lead the technical direction for all data engineering solutions at Hanwha Q Cells and lead the way in transforming our systems into self-healing, reliable, and reactive systems.

Want to work on massively scalable consumer systems? Re-invent how consumers use energy? Have a measurable impact on one of humanity's biggest challenges? Come to Hanwha Q Cells. You'll work alongside passionate engineers engaged in the design and development of a product that is changing the world.

RESPONSIBILITIES
    • Perform data development, modeling, simulation, testing, and quality assurance.
    • Analyze user requirements, create technical specifications, and write and test code. Maintain systems by monitoring and correcting defects. Complete systems risk and reliability analyses.
    • Monitor system performance, perform maintenance and data integrations for existing systems, and maintain compliance with industry standards.
    • Assemble large, complex data sets that meet functional and non-functional business requirements.
    • Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
    • Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS and SQL technologies.
    • Build analytical tools that utilize the data pipeline and provide actionable insights into key business performance metrics, including operational efficiency and customer acquisition.
    • Work with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues.
    • Continually update technical knowledge and skills by attending in-house and external courses, reading manuals, and evaluating new applications.
    • Apply good security practices and experience writing code that manages customer data.
    • Impeccable communication and team skills with shared ownership of code and other deliverables.
    • Willingness to work with and learn new technologies.
    • Ability to mentor and guide junior members of the team.


REQUIREMENTS
    • BS/MS in Computer Science, Engineering, or Math is preferred.
    • 5+ years of experience designing and coding enterprise-level applications.
    • Strong experience developing complex enterprise applications with Java, Python, and/or NodeJS.
    • Experience with REST API architecture and development, especially using Swagger.
    • Strong knowledge of Git including version control, branching, merging/rebasing, and pull requests.
    • Strong focus on automation, including Continuous Integration / Deployment, with experience writing unit and integration tests.
    • Hands-on experience with SQL database design.
    • Demonstrated understanding of and experience with big data tools such as Kafka, Spark, and Hadoop, and with NoSQL and relational SQL databases, including DynamoDB and Postgres.
    • Experience with Java/Kotlin and/or Python.
    • Experience architecting solutions in GCP and/or AWS, specifically using workflow management and pipeline tools such as Airflow, Luigi, and Azkaban.
    • Familiarity with managed cloud services (e.g. GCP, AWS, Azure) and their associated offerings.
    • Candidates must possess ample knowledge of and experience with system automation, deployment, and implementation.
    • Candidates must possess experience with JIRA, Jenkins, GitHub, and GitHub Actions, and ample experience configuring and automating monitoring tools.


PREFERRED QUALIFICATIONS
    • Familiarity with Docker and serverless architectures such as GCP Cloud Functions, AWS Lambda, DynamoDB, ECS, S3, GCP Pub/Sub, SQS, CloudFormation, Terraform, and/or other similar cloud services.
    • Experience with Maven/Gradle build systems.
    • Experience with GitHub Actions / Jenkins CI/CD pipelines.
    • Understanding of BFF (Backend-for-Frontend) patterns.
    • Experience with the development of self-healing, reliable and reactive systems.


Physical, Mental, & Environmental Demands

To comply with the Rehabilitation Act of 1973, the essential physical, mental, and environmental requirements for this job are listed below. These are the requirements normally expected to perform regular job duties. The incumbent must be able to successfully perform all of the functions of the job with or without reasonable accommodation.

Mobility

Standing: 20% of time

Sitting: 70% of time

Walking: 10% of time

Strength

Pulling: up to 10 Pounds

Pushing: up to 10 Pounds

Carrying: up to 10 Pounds

Lifting: up to 10 Pounds

Agility (F = Frequently, O = Occasionally, N = Never)

Turning: F

Twisting: F

Bending: O

Crouching: O

Balancing: N

Climbing: N

Crawling: N

Kneeling: N

Dexterity (F = Frequently, O = Occasionally, N = Never)

Typing: F

Handling: F

Reaching: F

Recommended Skills

  • AWS Lambda
  • Amazon DynamoDB
  • Amazon S3
  • Apache Hadoop
  • Apache Kafka
  • Apache Maven