Job Requirements of Databricks Administrator/Architect:
Employment Type: Contractor
Location: Raleigh, NC (Onsite)
Databricks Administrator/Architect
BCforward is currently seeking a highly motivated Technical Specialist in Raleigh, NC 27610.
Position Title: Technical Specialist - Junior
Location: Raleigh, NC 27610
Shift Timing: Monday to Friday, 40 hours per week
Anticipated Start Date: 1-2 weeks
Duration: 12-month contract, with the possibility of extension.
Pay Rate: $70.00 - $74.00 on W2
This role is hybrid, with an occasional need to be onsite.
CANDIDATES MUST CURRENTLY LIVE IN THE RALEIGH/DURHAM/CHAPEL HILL, NC AREA.
Duties and Responsibilities:
*Provide mentorship, guidance, knowledge sharing, and support to team members, promoting continuous learning and development.
*Oversee the design, implementation, and maintenance of Databricks clusters.
*Ensure the platform's scalability, performance, and security.
*Provide escalated support and troubleshooting to users.
*Oversee maintenance of role-based access to data and features in the Databricks Platform using Unity Catalog.
*Review cluster health checks and best-practices implementation.
*Review and maintain documentation for users and administrators.
*Design and implement tailored data solutions to meet customer needs and use cases, spanning data ingestion from APIs, data pipeline construction, analytics, and beyond, within a dynamically evolving technical stack.
*Work on projects involving on-prem data ingestion into Azure using ADF.
*Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.
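For context on the last duty, a medallion pipeline layers data as bronze (raw), silver (cleaned), and gold (aggregated). The sketch below illustrates those stages in plain Python with hypothetical records and column names; in practice these would be Delta Lake tables processed with PySpark on Databricks.

```python
# Illustrative medallion-architecture sketch. All data, field names, and
# functions here are hypothetical; a real pipeline would use PySpark/Delta Lake.

# Bronze layer: raw records ingested as-is from disparate sources.
bronze = [
    {"id": "1", "amount": " 10.5 ", "region": "east"},
    {"id": "2", "amount": "4.0",    "region": "EAST"},
    {"id": "2", "amount": "4.0",    "region": "EAST"},   # exact duplicate
    {"id": "3", "amount": "bad",    "region": "west"},   # malformed amount
]

def to_silver(rows):
    """Silver layer: clean and deduplicate - parse numerics, normalize case, drop bad rows."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop rows that fail validation
        key = (r["id"], amount)
        if key in seen:
            continue  # drop exact duplicates
        seen.add(key)
        out.append({"id": r["id"], "amount": amount, "region": r["region"].lower()})
    return out

def to_gold(rows):
    """Gold layer: aggregate cleaned rows into business-level totals per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'east': 14.5}
```

The same clean-then-aggregate flow maps directly onto Spark DataFrames, with each layer persisted as its own governed table.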
Skills (Required / Desired):
Extensive hands-on experience implementing Lakehouse architecture using Databricks Data Engineering platform, SQL Analytics, Delta Lake, Unity Catalog Required
Strong understanding of Relational & Dimensional modeling. Required
Demonstrated coding proficiency in Python, SQL, and PySpark to efficiently prioritize performance, security, scalability, and robust data integrations. Required
Experience implementing serverless real-time/near-real-time architectures using a cloud tech stack (i.e., Azure, AWS, or GCP) and Spark technologies (Streaming & ML). Required
Experience with Azure infrastructure configuration (networking), architecting and building large data ingestion pipelines, and conducting data migrations using ADF or similar technologies. Required
Experience working with SQL Server features such as SSIS and CDC. Required
Experience with Databricks platform, security features, Unity Catalog, and data access control mechanisms. Required
Experience with Git code versioning software. Required
Databricks Certifications Desired
Interested candidates, please send your resume in Word format. Please reference job code 230943 when responding to this ad.