Location: Gurgaon/Mumbai/Hyderabad

Job ID: DE2009

Job Type: Full time

Experience: 10-15 years

Why would you like to join us?

TransOrg Analytics, an award-winning Big Data and Predictive Analytics company, offers advanced analytics solutions to industry leaders and Fortune 500 companies across India, the US, the UK, Singapore, and the Middle East. Our product, Clonizo (customer cloning), has yielded significant incremental benefits for our clients. We have been recognized by CIO Review magazine as the "Predictive Analytics Company of the Year" and by TiE for excellence in entrepreneurship.

Responsibilities

  • Manage the development and deployment of 300 use cases on the cloud.
  • Play a key role in cloud migration strategy, with a focus on single- vs. multi-cloud decisions,
    selecting the best possible tools and services on the cloud, and designing the architecture.
  • Leverage DevOps tools such as GitHub, Docker, and Kubernetes to deploy and orchestrate models.
  • Establish pipelines for continuous data flow and delivery of insights to the business, especially in real-time environments.
  • Help develop a cost-optimization framework for running processes on the cloud, based on analytics job execution.
  • Be instrumental in designing the DevOps capability of the analytics group, helping make data a first-class citizen in the organization.
  • Interact regularly with technology, analytics, and business stakeholders to ensure the analytics use-case pipeline is prioritised by organizational priority and impact.
  • Deep-dive into business problems and propose multiple scenarios to solve problems in a
    specific area of focus.

Required Competencies and Skill Set

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 10-15 years of experience with the BFSI technology stack.
  • Experience in data architecture, database design, and data modeling.
  • Proven track record of designing and implementing complex data architectures using modern technologies such as cloud platforms (e.g., AWS, Azure, GCP), cloud data warehouses (e.g., Snowflake, Redshift, BigQuery), and distributed computing frameworks (e.g., Spark).
  • Expertise in guiding, implementing, and operationalizing AI/ML models using cloud tools such as SageMaker and Databricks.
  • Extensive experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Strong proficiency in data modeling techniques (e.g., ER modeling, dimensional modeling) and data modeling tools (e.g., erwin Data Modeler).
  • Hands-on experience designing and implementing batch and real-time data pipelines, using integration tools (e.g., Apache Kafka, Fivetran) and ETL/ELT processes.
  • Solid understanding of data governance principles, data security best practices, and regulatory compliance requirements (e.g., GDPR, CCPA).
  • Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize performance.
  • Strong communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and influence decision-making at all levels of the organization.
  • Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Data Engineer Associate) are a plus.

To apply for any open position, please share your updated CV along with the job ID at: careers@transorg.com