- Design and build intuitive APIs (REST/Python/CLI) for data handling, job management, serving processed artifacts, etc.
- Architect and build a scalable backend that handles large volumes of data
- Design and develop a large-scale data management platform in production environments
- Design and implement a highly scalable data querying and visualization system, architecting data structures for low-latency operation
- Take new feature prototypes from the POC phase to production level, with all that it entails
- Optimize data processing software for high throughput and low latency
- Build and deploy our software applications and adapters in customers' production environments
- Build plugins and microservices on top of our core libraries
- Experience with RESTful APIs and API design
- 5+ years of experience in hands-on professional software development
- Experience architecting web applications for database-backed systems
- Experience using performance tools and optimization techniques
- Experience developing software applications in production environments that handle large amounts of data
- Experience setting up Docker and container orchestration systems like Kubernetes/EKS/ECS
- Experience deploying applications to heterogeneous environments: cloud, on-prem (private cloud), and end-user (developer, robot)
- Experience setting up Apache Airflow
- Experience with Apache Parquet file-based columnar storage
- Good knowledge of common ETL packages/libraries and data ingestion
- Familiarity with graph search is a plus
- Exposure to AI training and testing toolchains
Job Type: Full-time
Salary: $180,000.00 - $250,000.00 per year
Benefits:
- Health insurance
- Paid time off
Experience:
- Python: 5 years (Required)
- APIs: 3 years (Required)