Data Engineer
BC Forward
Redmond, WA (Onsite)
Contractor
$45.00 - $50.00/Hour
BCforward is currently seeking a highly motivated Data Engineer for a remote opportunity.

Title: Data Engineer
Location: Remote
Duration: 4 Months (potential for extension)
Anticipated Start Date: ASAP
Please note this is the target date and is subject to change. BCforward will send official notice ahead of a confirmed start date.
Expected Duration: 5 Months (potential for extension)
Job Type: Full-time (40 hrs weekly), Contract, Remote
Pay Range: $45/hr to $50/hr on W2
Please note that actual compensation may vary within this range due to factors such as location, experience, and job responsibilities, and does not encompass additional non-standard compensation (e.g., benefits, paid time off, per diem, etc.).

Job Summary:
As a Software Engineer - Data, the candidate will be responsible for designing, developing, and maintaining efficient and reliable data pipelines. The candidate will work closely with stakeholders across the company to gather business requirements, build data models, and ensure data quality and accessibility. Expertise in Python, SQL, Airflow, and Spark will be crucial in optimizing our data infrastructure and enabling data-driven decision-making.

Typical Day in the Role
- Purpose of the Team: The purpose of this team is to be responsible for designing, developing, and maintaining data platforms.
- Key projects: This role will contribute to designing, developing, and maintaining efficient and reliable data pipelines.
Candidate Requirements:
- Disqualifiers: Candidates with low tenure and constant job hopping will not be eligible for the role.
- Degree or Certification: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field is preferred.
- Data Platform: Design, build, and maintain scalable data platform and pipelines using Python, SQL, Airflow, and Spark.
- Business Requirements Gathering: Collaborate with stakeholders to understand and translate business requirements into technical specifications.
- Data Modeling: Develop and implement data models that support analytics and reporting needs.
- Data Quality and Governance: Ensure data accuracy, consistency, and reliability by implementing robust data validation and quality checks.
- Stakeholder Collaboration: Work with cross-functional teams, including data analysts, data scientists, and business leaders, to deliver high-quality data solutions.
- Performance Optimization: Continuously monitor and optimize data pipelines for performance, scalability, and cost-efficiency.
- Monitoring and Observability: Build and implement monitoring and observability metrics to ensure data quality and detect anomalies in data pipelines.
- Documentation and Communication: Maintain clear and comprehensive documentation of data processes and communicate technical concepts effectively to non-technical stakeholders.
- Experience: 2 years of experience in data engineering and infrastructure.
- Technical Skills: Proficiency in data warehouse management, Python, SQL, Airflow, and Spark.
- Data Pipeline Expertise: Strong experience in building and maintaining robust data pipelines and ETL processes.
- Analytical Skills: Ability to gather business requirements and debug issues in ingestion or other areas of the data warehouse.
- Communication: Excellent verbal and written communication skills, with the ability to convey technical information to non-technical audiences.
- Collaboration: Proven ability to work effectively in a collaborative, cross-functional environment.
- Education: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with data warehousing technology (e.g., Delta Lake, Azure Fabric, Snowflake, Redshift, Big Query).
- Knowledge of data governance and data security best practices.
Skills:
- Python
- SQL
- Kubernetes
- Airflow
- Scala
Interested candidates: please send your resume in Word format. Please reference job code 250208 when responding to this ad.