Databricks Platform Engineer
Job Title: Databricks Platform Engineer
Location: Remote, East Coast preferred
Duration: Contract - 9 months
Pay Range: $60/hr. to $64/hr. (W2)
Job ID: 372526
About BCforward
BCforward is a leading global IT consulting and workforce solutions firm providing services and support to Fortune 500 and government clients. Founded in 1998, BCforward has grown with our customers' needs into a full-service business solutions provider. With delivery centers and offices across North America and India, we take pride in building long-term relationships and delivering excellence through innovation, collaboration, and integrity.
Job Description
We are seeking a Databricks Platform Engineer (AWS) to join our dynamic team. The ideal candidate will have strong experience with Databricks on AWS, PySpark, SQL, Delta Lake, Unity Catalog, and Git-based development workflows, along with a proven ability to implement scalable ingestion frameworks, operationalize observability, and deliver reliable CI/CD automation aligned with platform standards.
Responsibilities:
- Implement metadata-driven ingestion frameworks and standard ingestion patterns.
- Build batch, incremental, and CDC data ingestion pipelines.
- Enforce platform standards for configuration, schema evolution, tagging, and error handling.
- Implement monitoring, metrics, and alerting for data pipelines.
- Support operational validation, failure simulation, and recovery testing.
- Embed data quality rules and quarantine patterns in ingestion processes.
- Apply Unity Catalog tags such as domain, owner, sensitivity, and freshness at ingestion time.
- Ensure governance metadata is written to governed schemas.
- Build CI/CD pipelines using GitHub Actions, including validation, promotion, and rollback.
- Manage secrets via AWS Secrets Manager and support environment automation.
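To give candidates a concrete sense of the metadata-driven ingestion and tagging patterns listed above, here is a minimal Python sketch. The config fields, helper names, and quarantine convention are illustrative assumptions, not BCforward's or the client's actual framework; only the governance tag names (domain, owner, sensitivity, freshness) come from the posting itself.

```python
from dataclasses import dataclass, field

# Governance tags the posting says are applied at ingestion time.
REQUIRED_TAGS = ("domain", "owner", "sensitivity", "freshness")

@dataclass
class TableConfig:
    """Hypothetical metadata entry describing one source table."""
    source_path: str
    target_table: str
    load_mode: str                      # "batch", "incremental", or "cdc"
    tags: dict = field(default_factory=dict)

def build_ingest_spec(cfg: TableConfig) -> dict:
    """Turn a metadata entry into a standard ingestion spec, enforcing
    platform standards for load mode and required governance tags."""
    if cfg.load_mode not in ("batch", "incremental", "cdc"):
        raise ValueError(f"unsupported load mode: {cfg.load_mode}")
    missing = [t for t in REQUIRED_TAGS if t not in cfg.tags]
    if missing:
        raise ValueError(f"missing governance tags: {missing}")
    return {
        "source": cfg.source_path,
        "target": cfg.target_table,
        "mode": cfg.load_mode,
        "tags": dict(cfg.tags),
        # Rows failing data quality rules would be routed here
        # (an assumed naming convention for the quarantine pattern).
        "quarantine_table": cfg.target_table + "_quarantine",
    }

if __name__ == "__main__":
    cfg = TableConfig(
        source_path="s3://raw/orders/",
        target_table="sales.orders",
        load_mode="incremental",
        tags={"domain": "sales", "owner": "data-eng",
              "sensitivity": "internal", "freshness": "hourly"},
    )
    print(build_ingest_spec(cfg)["quarantine_table"])  # sales.orders_quarantine
```

In a real Databricks deployment, the spec would drive PySpark/Delta Lake pipelines and the tags would be written to Unity Catalog; this sketch only shows the config-validation shape of the pattern.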
Required Skills & Qualifications:
- 3-5 years of data or platform engineering experience.
- 3+ years of hands-on Databricks experience on AWS.
- Strong proficiency with PySpark and SQL.
- Experience with Delta Lake, Unity Catalog, and Git-based development workflows.
Preferred Skills:
- Hands-on experience building ingestion frameworks.
- Monitoring and alerting implementation for data platforms.
- Production platform support and on-call readiness.
Success Measures:
- Stable, reusable ingestion frameworks adopted across teams.
- Reduction in custom pipeline development through standardization.
- Reliable CI/CD with proactive operational monitoring and timely remediation.
Why BCforward?
At BCforward, we believe in advancing lives and careers. When you join our team, you gain access to:
- Competitive compensation and benefits.
- Opportunities for growth with global clients.
- A supportive, inclusive culture that values innovation and people.
- Exposure to cutting-edge technologies and projects.
About Our Commitment
BCforward is an equal opportunity employer. We value diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, or veteran status.
Interested? Apply Now!
If this sounds like the right opportunity for you, please apply with your most recent resume.