Kafka Developer
BCforward
Minneapolis, MN (Onsite)
Contractor
BCforward is currently seeking a highly motivated Kafka Developer for a Remote Opportunity.

Position Title: Kafka Developer
Location: Remote
Anticipated Start Date: 10/21/2024 (Please note this is the target date and is subject to change. BCforward will send official notice ahead of a confirmed start date.)
Expected Duration: 6+ months
Job Type: Contract - Remote
Pay Range: $80/hr - $85/hr

Please note that actual compensation may vary within this range due to factors such as location, experience, and job responsibilities, and does not encompass additional non-standard compensation (e.g., benefits, paid time off, per diem, etc.).

One of our industry-leading clients is seeking an experienced Confluent Cloud Kafka Developer to design, develop, and manage scalable, real-time data streaming solutions using Apache Kafka on Confluent Cloud. The ideal candidate will work with data engineering teams to architect streaming data pipelines, integrate diverse data sources, and optimize Kafka for high-performance applications. This role requires strong expertise in Apache Kafka, Confluent Cloud, and distributed streaming platforms, with a focus on real-time integrations.

Required Qualifications:
- 3+ years of experience in software development, with a strong focus on Apache Kafka and distributed systems.
- 2+ years of experience in Python (preferred) or Java for building Kafka applications.
- Proven experience with Confluent Kafka, including Kafka Connect, Schema Registry, Apache Flink, and KSQL.
- Experience with Infrastructure as Code (IaC) tools, particularly Terraform.
- Hands-on experience with AWS services (e.g., EC2, S3, IAM, Lambda) and Snowflake.
- Solid understanding of data integration, ETL processes, and data pipeline orchestration.
- Experience with operational monitoring and performance optimization of streaming pipelines.
- Ability to troubleshoot and resolve complex technical issues related to Kafka and its ecosystem.
- Strong analytical and problem-solving skills, with attention to detail.
- Excellent verbal and written communication skills.
- Ability to work effectively in a team environment and collaborate with cross-functional teams.
Preferred Qualifications:
- 2+ years of experience working with Confluent Cloud and implementing Kafka-based solutions.
- Experience with other data streaming tools and platforms.
- Familiarity with DevOps tools and practices, including CI/CD, Docker, and Kubernetes.
- Certification in AWS, Kafka, or related technologies.
Interested candidates, please send your resume in Word format. Please reference job code 230009 when responding to this ad.