Job Summary:
We are seeking a skilled and dynamic Cloud (AWS) / Databricks Technical Architect to join our team. The ideal candidate will have extensive experience designing, building, and deploying cloud-based solutions on AWS and Databricks, with a focus on data engineering, analytics, and machine learning. In this role, you will be responsible for driving the architecture and implementation of scalable, secure, and efficient cloud-based data solutions that support the organization's data-driven initiatives.
Key Responsibilities:
- Lead the design and implementation of cloud-based data architectures on AWS, leveraging a variety of AWS services (e.g., S3, EC2, Lambda, RDS, Redshift, Athena, Glue).
- Architect and deploy scalable, secure, and cost-effective Databricks environments to process large volumes of data for analytics, data engineering, and machine learning.
- Provide leadership in designing modern data architectures, including real-time data pipelines, ETL/ELT workflows, and big data processing systems using Databricks and AWS technologies (an illustrative sketch of this kind of pipeline follows this list).
- Define and implement best practices for managing and optimizing data lakes, data warehouses, and data pipelines.
- Ensure that architecture decisions align with business requirements, security policies, and compliance standards.
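To give candidates a concrete sense of the day-to-day work, here is a minimal, purely illustrative PySpark ETL sketch of the kind of Databricks-on-AWS pipeline described above. The bucket paths, column names, and aggregation logic are hypothetical placeholders, not part of any actual system referenced by this role.

```python
# Minimal, illustrative PySpark ETL sketch (Databricks-style).
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession named `spark` already exists; this line
# simply makes the sketch runnable outside a notebook as well.
spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw JSON events landed in an S3 data lake (hypothetical path).
raw = spark.read.json("s3://example-data-lake/raw/orders/")

# Transform: basic cleansing plus a daily revenue aggregate.
orders = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("order_timestamp"))
)
daily_revenue = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

# Load: write curated results back to the lake in Delta format (hypothetical path).
daily_revenue.write.format("delta").mode("overwrite").save(
    "s3://example-data-lake/curated/daily_revenue/"
)
```

In practice, jobs like this would typically be scheduled through an orchestrator (e.g., Airflow or Databricks Workflows) and write to governed Delta tables rather than raw paths.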
Requirements:
- AWS Certification: Hold and leverage AWS certification to design and implement cloud solutions.
- Cloud Architecture: Design and optimize cloud architecture for scalability and efficiency.
- Containers & Orchestration: Implement containers and orchestration tools for streamlined application deployment.
- Microservices Architecture: Design and manage microservices architectures for flexible and scalable applications.
- Cloud Environment Setup and Configuration: Set up and configure cloud environments to meet project requirements.
- Security & Access Management: Ensure secure access management and compliance within cloud environments.
- SQL, Python, Visualization & Analytical Tools: Use SQL, Python, and analytical tools for data processing and visualization.
- API Development & Management: Develop and manage APIs for seamless data integration and functionality.
Education & Experience:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- 8+ years of experience in cloud architecture, data engineering, or a similar technical role, including at least 5 years of hands-on experience with AWS and Databricks.
- Proven track record of architecting and deploying large-scale data engineering solutions on AWS and Databricks.
- Experience with a variety of data processing frameworks (e.g., Apache Spark, Apache Kafka, Airflow) and cloud-based data storage solutions.
Technical Skills & Competencies:
- Deep expertise in AWS services, including but not limited to S3, EC2, Lambda, Glue, Redshift, Athena, and RDS.
- Strong experience with Databricks, including notebook development, Spark-based processing, and managing Databricks clusters.
- Expertise in data engineering concepts, including ETL/ELT, data lakes, data pipelines, and real-time streaming architectures.
- Proficiency in programming languages such as Python, Scala, SQL, or Java for data processing and solution development.
- Experience with DevOps practices, CI/CD pipelines, containerization (e.g., Docker, Kubernetes), and infrastructure as code (e.g., Terraform, CloudFormation).
- Familiarity with machine learning workflows and tools, particularly those that integrate with Databricks (e.g., MLflow, Spark MLlib); a brief illustrative sketch follows this list.
- Strong understanding of cloud security best practices, including IAM, encryption, and network security.
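As a purely illustrative example of the ML workflow tooling mentioned above, the sketch below logs a simple experiment with MLflow. It uses scikit-learn rather than Spark MLlib for brevity, and the experiment path, parameters, and metric are hypothetical.

```python
# Minimal, illustrative MLflow experiment-tracking sketch.
# Experiment path, model choice, and metrics are hypothetical.
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data stands in for a real feature table.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Hypothetical Databricks-style experiment path.
mlflow.set_experiment("/Shared/churn-model-sketch")

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Log parameters, metrics, and the trained model for reproducibility.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")
```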
Preferred Qualifications:
- Experience with big data frameworks (e.g., Hadoop, Spark) and container orchestration platforms (e.g., Kubernetes).
- Familiarity with data governance, privacy, and compliance frameworks.