Location: Gurugram

Job ID: ML2011

Job Type: Full time

Experience: 4+ years

About the company:

This role is part of Pickl.AI, the education brand of TransOrg Analytics. TransOrg Analytics is a Data Science and ML company that is over a decade old.

The role

Pickl.AI (TransOrg’s education brand) is looking for a Project Associate who is technically immersed in data science as a subject. We are looking for a creative professional who wants to accelerate their exposure to many areas of ML, loves wearing multiple hats, and can take full ownership of their work.

Responsibilities

  • Design the data pipelines and engineering infrastructure to support our clients’ enterprise machine learning systems at scale
  • Take offline models built by data scientists and turn them into production machine learning systems
  • Develop and deploy scalable tools and services for our clients to handle machine learning training and inference
  • Identify and evaluate new technologies to improve performance, maintainability, and reliability of our clients’ machine learning systems
  • Apply software engineering rigor and best practices to machine learning, including CI/CD, automation, etc.
  • Support model development, with an emphasis on auditability, versioning, and data security
  • Facilitate the development and deployment of proof-of-concept machine learning systems
  • Exposure to LLMs and deployment methodologies is a plus

What are we looking for?

  • Bachelor’s in Computer Science, Engineering, Statistics, Mathematics, or a related quantitative degree
  • 4+ years of experience building end-to-end systems as a Platform Engineer, ML DevOps Engineer, or Data Engineer (or equivalent)
  • Knowledge of machine learning algorithms and frameworks (e.g., TensorFlow, PyTorch)
  • Hands-on experience with Python or Java
  • Experience deploying code on AWS, GCP, or Azure
  • Good grasp of CI/CD pipelines and Infrastructure-as-Code (IaC) tools (e.g., Terraform, CloudFormation)
  • Experience working with relational and non-relational databases, data warehousing, and data streaming frameworks (e.g., Apache Kafka, Spark, SQL)
  • Familiarity with concepts like firewalls, encryption, VPNs, and secure data transfer
  • Experience with monitoring and logging tools such as Prometheus and the ELK Stack
  • Experience developing and maintaining ML systems built with open source tools
  • Experience developing with containers and Kubernetes in cloud computing environments
  • Familiarity with one or more data-oriented workflow orchestration frameworks (KubeFlow, Airflow, Argo, etc.)
  • Exposure to deep learning approaches and modeling frameworks (PyTorch, TensorFlow, Keras, etc.)

To apply for any open position, please share your updated CV along with the job ID at: careers@transorg.com