Databricks AWS Architect
Join a reputable financial institution as their Databricks AWS Architect in a remote-friendly, full-time role!

The Databricks AWS Architect builds secure, highly scalable big data solutions that achieve tangible, data-driven outcomes while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS architecture. The role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.

 

Role & Responsibilities:

  • Work closely with team members to lead and drive enterprise solutions, advising on key decision points on trade-offs, best practices, and risk mitigation
  • Promote, emphasize, and leverage big data solutions to deploy performant systems that appropriately auto-scale, are highly available, fault-tolerant, self-monitoring, and serviceable
  • Use a defense-in-depth approach in designing data solutions and AWS infrastructure
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Assist and advise data engineers in the preparation and delivery of raw data for prescriptive and predictive modeling
  • Aid developers in identifying, designing, and implementing process improvements with automation tools to optimize data delivery
  • Build the infrastructure required for optimal extraction, loading, and transformation of data from a wide variety of data sources
  • Work with the developers to maintain and monitor scalable data pipelines
  • Perform root cause analysis to answer specific business questions and identify opportunities for process improvement
  • Build out new API integrations to support continuing increases in data volume and complexity
  • Collaborate with the Enterprise Digital Intelligence (EDI) team to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision-making across the organization
  • Implement processes and systems to monitor data quality and security, ensuring production data is accurate and available for key stakeholders and the business processes that depend on it
  • Employ change management best practices to ensure that data remains readily accessible to the business
  • Maintain tools, processes and associated documentation to manage API gateways and underlying infrastructure
  • Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs

 

Required Skills & Experience:

  • 2+ years’ experience with Databricks
  • 3+ years of related experience designing secure, scalable, and cost-effective big data architecture
  • 5+ years’ experience in a software development, data engineering, or data analytics field using Python, Scala, Spark, Java, or equivalent technologies
  • Bachelor’s or Master’s degree in Big Data, Computer Science, Engineering, Mathematics, or similar area of study or equivalent work experience
  • Knowledge of different programming and scripting languages
  • Mid-level knowledge of code versioning tools (such as Git, Mercurial, or SVN)
  • Expert proficiency in Python, C++, Java, R, and SQL
  • Proficiency in software engineering best practices employed in the software development lifecycle, including coding standards, code reviews, source control management, build processes, testing and operations
  • Advanced SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases
  • Strong knowledge of financial industry technology standards and compliance requirements, and experience working with audit and regulatory bodies
  • Mid-level experience with the core foundational AWS Cloud platform services and expertise in first-party Big Data & AI services such as AWS Glue (including Glue Crawlers), Amazon Athena, Amazon Kinesis, and Amazon QuickSight
  • Proficiency in working with all types of operating systems, especially Linux and Unix. RHEL RHCSA/RHCE certifications preferred.
  • Proficient-level experience with the architecture, design, build, and optimization of big data collection, ingestion, storage, processing, and visualization
  • Proficient in building, automating, and deploying data pipelines and workflows into end-user-facing applications
  • Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
  • Basic experience with or knowledge of agile methodologies
  • Technical expertise with data models, data mining, and segmentation techniques
  • Working knowledge of RESTful APIs, OAuth2 authorization framework and security best practices for API Gateways
  • Expert at diagnostics and problem resolution, providing third-level support
  • Familiarity with unstructured data sets (e.g., voice, image, log files, social media posts, email)
  • Possess an organized methodical approach and bring a continuous improvement mindset
  • Demonstrated predisposition for action, willingness to partner, and innate drive to provide an exceptional member and employee experience
  • Highly creative and innovative technologist that thrives independently and collaborates well in a team environment

Preferred:

  • Experience in a financial institution
  • Expert-level knowledge of AWS infrastructure configurations and service offerings
  • Expert-level knowledge of data frameworks, data lakes and open-source projects such as Apache Spark, MLflow, and Delta Lake
  • Experience with master data management (MDM) using data governance solutions (Collibra preferred)
  • Advanced technical certifications: AWS Certified Data Analytics, DASCA Big Data Engineering and Analytics
  • AWS Certified Cloud Practitioner, Solutions Architect, Certified Developer, or SysOps Administrator certifications preferred

 

About Versique

 

Versique is a high-performance recruiting firm based in Minneapolis, MN specializing in interim solutions, direct hire, and executive leadership search. We believe people are the ultimate business advantage. Our experienced functional recruiting teams work within a variety of areas of expertise (HR, Finance & Accounting, Demand Generation, IT, and Engineering) and broad industries (Healthcare, Banking, Consumer Packaged Goods, Manufacturing, Private Equity, and Family-Owned). Voted as a “Star Tribune Best Places to Work” three years in a row by our employees, Versique is one of the largest and fastest growing staffing and recruiting firms in the Midwest. The Versique brand represents a powerful combination of “versatile” and “unique” as it hints at the concept of “search” in its pronunciation: ver-seek.

Versique is an equal opportunity employer committed to creating a diverse workforce. We consider all qualified applicants without regard to race, religion, color, sex, national origin, age, sexual orientation, gender identity, disability, or veteran status, among other factors.

#LI-BK1

 


