Principal Software Engineer – Microscopy Data Management & Cloud Platform

Full time | India - Remote | Req ID JR-043859

Revvity | About Us

Revvity is a developer and provider of end-to-end solutions designed to help scientists, researchers, and clinicians solve the world’s greatest health challenges. We pair the enthusiasm of an industry disruptor with the experience of a longtime leader. Our team of 11,000+ colleagues from around the globe is vital to our success and the reason we’re able to push boundaries in pursuit of better human health.

Find your future at Revvity.

Role summary

We are looking for a hands-on Principal Software Engineer to design and implement a distributed scientific data platform for microscopy and life-science applications. The platform must support microscopy image data and metadata, along with search, transformation, storage, transfer, and analysis workflows across instruments, user PCs, on-premises deployments, and AWS cloud environments.

This role requires deep expertise in database-centric system design, high-throughput data pipelines, reliable data transfer protocols, backend interface design and scalable multi-user distributed systems.

You will

  • Contribute to the design and development of a distributed data management platform for microscopy and scientific imaging workflows.
  • Design scalable storage, indexing, search, caching and high-throughput data-transfer mechanisms for large scientific datasets in hybrid edge/on-prem/cloud environments.
  • Design and implement robust backend interfaces (APIs, contracts, schemas) enabling interoperability across platforms (instrument control, image analysis, UI, cloud services).
  • Design and implement reliable and resumable data transfer pipelines between instruments, local systems and cloud (including offline-first and intermittent connectivity scenarios).
  • Develop high-performance data ingestion, streaming and transformation pipelines for large image datasets and metadata.
  • Define and implement data consistency, integrity and traceability mechanisms (checksums, versioning, audit trails, reproducibility).
  • Collaborate with image analysis, web UI, instrument software and domain experts to define robust interfaces and end-to-end workflows.
  • Contribute to engineering standards, CI/CD, observability, reliability and secure software delivery for cloud-based and instrument-hosted systems.
  • Provide technical leadership through architecture decisions, hands-on implementation, design reviews, mentoring and cross-team alignment.

Must have

  • MS in a STEM field or equivalent practical experience.
  • 10+ years in software engineering, including design of complex distributed backend or data platforms.
  • Strong programming skills in Python and C#.
  • Deep expertise in database design, schema evolution, query optimization, transactions, indexing, data security and data lifecycle management.
  • Strong experience with SQL and at least one of: document, key-value, graph or vector databases.
  • Strong experience in backend API design (REST/gRPC), contract versioning, backward compatibility and interface governance.
  • Understanding of network protocols and performance tuning (HTTP/2, gRPC, TCP/IP behavior, latency vs throughput trade-offs).
  • Experience with asynchronous and event-driven architectures (message queues, streaming systems).
  • Experience with data integrity mechanisms (checksums, hashing, validation, consistency models).
  • Strong experience designing search solutions for complex metadata and dataset discovery.
  • Proven experience with AWS-based backend systems, storage, compute and scalable service design.
  • Experience building secure multi-user systems with authentication, authorization and auditability.
  • Strong architectural thinking, system decomposition and performance optimization skills.

Strong plus

  • Experience with scientific or imaging data systems, ideally microscopy, digital imaging or laboratory software.
  • Experience with microscopy image metadata, OME concepts, image file formats, tiled or multiresolution image handling or scientific imaging pipelines.
  • Experience with S3-compatible object storage, hybrid deployments and on-prem/cloud synchronization.
  • Experience with caching, asynchronous workflows, data pipelines and event-driven architectures.
  • Experience with regulated or quality-driven environments.
  • Experience with access control integration, Keycloak, LDAP/AD or customer-specific identity systems.

Nice to have

  • C++/Bash/PowerShell
  • Docker/Kubernetes/Terraform
  • Vector search or AI/ML data retrieval patterns
  • Experience with life-science ontologies and scientific data standards