Snowflake / Databricks Developer

Software Development
  • India

Optimum Solutions

You will be responsible for working on a Banking Application project as a Data Migration / Snowflake / Databricks Developer in Chennai. Your main responsibilities will include:

– Creating, testing, and implementing enterprise-level apps with Snowflake
– Designing and implementing features for identity and access management
– Developing authorization frameworks for better access control
– Implementing query optimization and security measures, including encryption
– Solving performance and scalability issues in the system
– Managing transactions using distributed data processing algorithms
– Owning the project right from start to finish
– Building, monitoring, and optimizing ETL and ELT processes with data models
– Migrating solutions from on-premises setup to cloud-based platforms
– Implementing the latest delivery approaches based on data architecture
– Documenting projects and tracking progress against user requirements
– Integrating data with third-party tools across the architecture, design, coding, and testing phases
– Documenting data models, architecture, and maintenance processes
– Reviewing and auditing data models for enhancement
– Maintaining data pipelines built on ETL tools
– Coordinating with BI experts and analysts for customized data models and integration
– Performing code updates, new code development, and reverse engineering
– Providing performance tuning, user acceptance training, and application support
– Ensuring confidentiality of data
– Conducting risk assessment, management, and mitigation plans
– Engaging with teams for status reporting and routine activities
– Performing migration activities from one database to another or on-premises to the cloud
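As a minimal sketch of the extract-and-transform work the ETL/ELT responsibilities above describe, the following Python function reads rows from a CSV source and shapes them into load-ready records. The column names (`account_id`, `amount`, `currency`) are hypothetical examples, not taken from the posting:

```python
import csv
import io

def extract_transform(csv_text):
    """Extract rows from a CSV source and transform them into
    load-ready records: trimmed fields and rounded numeric amounts.
    Column names here are hypothetical."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({
            "account_id": row["account_id"].strip(),
            "amount": round(float(row["amount"]), 2),
            "currency": row["currency"].strip().upper(),
        })
    return records

source = "account_id,amount,currency\n A1 ,100.456,usd\nA2,55.5, inr \n"
print(extract_transform(source))
```

In a real pipeline the cleaned records would then be staged and loaded into Snowflake (for example via `COPY INTO` or a connector), but that step requires live credentials and is omitted here.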

Qualifications required for this role include:

– Minimum of 5 years of intermediate-level experience
– Bachelor’s degree in computer science or equivalent practical experience
– Knowledge of SQL language and cloud-based technologies
– Expertise in data warehousing concepts, data modeling, and metadata management
– Familiarity with data lakes, multi-dimensional models, and data dictionaries
– Experience migrating to the Snowflake platform on AWS or Azure
– Proficiency in performance tuning and setting up resource monitors
– Skills in Snowflake roles, databases, schemas, data modeling, SQL performance measurement, query tuning, and database tuning
– Familiarity with ETL tools and cloud integrations, and experience building analytical solutions and models in languages such as Python, Java, and JavaScript
– Experience with Hadoop, Spark, and other warehousing tools
– Ability to manage sets of XML, JSON, and CSV from disparate sources
– Knowledge of SQL-based databases like Oracle, SQL Server, Teradata, etc.
– Understanding of Snowflake warehousing, architecture, processing, and administration
– Experience in data ingestion into Snowflake
– Exposure to enterprise-level technical applications of Snowflake
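One qualification above asks for the ability to manage XML, JSON, and CSV data from disparate sources. A small Python sketch of that task, normalizing records from all three formats into one common schema (the `id`/`name` fields and `customer` element names are hypothetical), might look like:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def normalize_sources(csv_text, json_text, xml_text):
    """Merge records arriving as CSV, JSON, and XML from
    disparate sources into one common schema.
    Field and element names are hypothetical."""
    records = []
    # CSV: one record per row, header names map directly.
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({"id": row["id"], "name": row["name"]})
    # JSON: a list of objects with the same logical fields.
    for obj in json.loads(json_text):
        records.append({"id": obj["id"], "name": obj["name"]})
    # XML: attributes and child elements mapped to the schema.
    for node in ET.fromstring(xml_text).findall("customer"):
        records.append({"id": node.get("id"), "name": node.findtext("name")})
    return records

merged = normalize_sources(
    "id,name\n1,Asha\n",
    '[{"id": "2", "name": "Ravi"}]',
    '<customers><customer id="3"><name>Meena</name></customer></customers>',
)
print(merged)
```

The same normalization pattern scales to ingestion jobs that land heterogeneous source files into a single staging table before loading.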

Please note that this role requires immediate joiners or candidates with a notice period of 1 month. You will be working from the office in Chennai (Ramanujam IT Park, Tharamani) from Monday to Friday, 9:00 AM to 6:00 PM.

To apply for this job please visit www.shine.com.
