The Big Data Developer plays an important role in delivering exceptional service levels. This role assists other developers and the Enterprise Digital Intelligence cohort with analyzing, building, and testing data development work. Additionally, this role assists in programming existing applications and aids in the conceptualization and development of new software.
Role & Responsibilities:
- Develop Business Intelligence solutions for the enterprise
- Build ELT procedures for real-time and near-real-time data automation to support ongoing analytics at the enterprise and department level
- Implement machine learning models using Databricks, H2O.ai, or similar frameworks
- Partner with RPA team to build service bots using advanced RPA techniques
- Review requirements created by requesters; ask any necessary clarifying questions before starting work
- Lead and manage the end-to-end life cycle of data production, from project conception to finished product
- Provide accurate work estimates and complete delivery of changes on schedule
- Participate as a team member on Enterprise BI high value projects
- Work with IT Management and Business Intelligence teams to develop programming templates and standards consistent with industry best practice
- Collaborate with appropriate stakeholders (e.g., IT staff, department managers, data engineers, data scientists, end users) to gather requirements, develop specifications, and author, debug, test, and document program code
- Create testable requirements and write and execute unit tests and system automation
- Perform DevOps activities as a part of the release and performance management processes
- Share knowledge and provide technical guidance and mentorship to less experienced developers
Required Skills & Experience:
- 2+ years of strong programming/scripting skills in R, Python, PySpark, or Scala
- 2+ years of experience with open-source ML tools such as TensorFlow, PyTorch, and scikit-learn
- Intermediate experience with AWS cloud services, including Lambda functions
- Intermediate experience with ELT, Databricks, Delta Lake, data lakes, and AWS cloud technologies
- Proficient in integrating Power BI and ML tools in cloud environments
- Familiarity with NoSQL data management systems (HBase, Cassandra, AWS S3, etc.)
- Familiarity with OCR libraries such as Tesseract, PyOCR, OpenCV, and .NET OCR SDKs
- Ability to stay abreast of developments in programming languages and tools
- Proven ability to troubleshoot complex technical problems and cooperatively resolve them using personal expertise as well as internal and external resources
- Advanced experience in gathering requirements and authoring programming specifications
- Work experience in a financial institution
- Associate or Bachelor’s Degree in Computer Science
- Familiarity with RPA tools is a plus
Versique is a high-performance recruiting firm based in Minneapolis, MN specializing in interim solutions, direct hire, and executive leadership search. We believe people are the ultimate business advantage. Our experienced functional recruiting teams work within a variety of areas of expertise (HR, Finance & Accounting, Demand Generation, IT, and Engineering) and broad industries (Healthcare, Banking, Consumer Packaged Goods, Manufacturing, Private Equity, and Family-Owned). Voted as a “Star Tribune Best Places to Work” three years in a row by our employees, Versique is one of the largest and fastest growing staffing and recruiting firms in the Midwest. The Versique brand represents a powerful combination of “versatile” and “unique” as it hints at the concept of “search” in its pronunciation: ver-seek.
Versique is an equal opportunity employer committed to creating a diverse workforce. We consider all qualified applicants without regard to race, religion, color, sex, national origin, age, sexual orientation, gender identity, disability, or veteran status, among other factors.