Zepto Systems is a team of industry experts with more than two decades of experience in the IT industry. We believe that every client deserves an equal amount of attention, regardless of their business size and industry. We also believe that every business, regardless of its size and budget, should have an equal opportunity to compete with its competitors.
40 hours per week
Information Technology
3-6 Years
fulltime
Bachelors
23-37 Years
We are looking for an AWS DevOps Engineer.
Requirements:
Experience implementing end-to-end Continuous Integration and Continuous Delivery (CI/CD) Pipelines (CodeCommit, CodeBuild) and providing best practice advice and recommendations. Strong hands-on experience working with CI/CD pipelines and tools: Jenkins, CodePipeline, CodeDeploy.
Experience in Infrastructure as Code with both Terraform and CloudFormation
Demonstrable experience of client interaction at all stages
Hands-on experience with AWS services: EC2, ELB, ECS, RDS, Lambda, API Gateway, Route53, S3, SageMaker.
Experience working with containerized applications (Docker / Kubernetes)
Experience with architecture as code: Terraform, CloudFormation.
Hands-on experience with MLOps tools such as Kubeflow and MLflow.
Production environment experience with AWS
Experience collaborating across multiple functional and/or technical teams to deliver agile-based projects
Experience delivering complete solutions utilizing common scripting/programming languages (e.g., Bash, PowerShell, Python, Java, etc.)
Experience working with Linux and Windows Server including Active Directory/SSO
Demonstrated knowledge of cloud networking and security (e.g., identity and access management, firewalls, etc.)
Experience delivering microservice solutions, especially in AWS
Experience working with Microsoft Azure as well as AWS would be ideal but is not required.
We are looking for a Data Engineer.
Responsibilities:
Work collaboratively with Data Scientists, Data Engineers, Data Analysts, DevOps engineers, Cloud Solutions Architects, and other stakeholders, including product owners and business analysts, in order to gather, analyse, and understand data engineering requirements
Build a range of data products and integrate and manage datasets from multiple external sources, including data extraction, data ingestion, and processing of large datasets
Design and develop methods to automate data ingestion from external sources (including different data vendors) into our AWS products and create efficient ETL pipelines
Design, optimise, and implement data models and data schemas for different data sources and use cases
Build, design, refactor, and optimise AWS data lakes and AWS data warehouses for a variety of data sources
Work with a range of storage systems, including relational databases, NoSQL, and others
Requirements:
2-3 years' experience in Data Engineering
Extensive experience with AWS services such as RDS, Lambda, Glue, Neptune, Athena, DMS, Redshift, and EC2, and with machine learning tools
Core SQL competencies, such as stored procedures, batch jobs, and implementing highly performant SQL code for our AWS products
Good working knowledge of any of the following: Python, Java, SQL
Experience with version control tools (e.g., Git)
Aptitude for, and interest in, working in a fast-paced environment
Strong verbal/written communication and data presentation skills, including an ability to communicate effectively with both business and technical teams
Experience with data ingestion from APIs, RSS feeds, and FTP
Experience working with either a MapReduce or an MPP system
Good understanding of data modelling and of data engineering tools such as Kafka, Spark, and Hadoop
Experience with graph databases
40 hours per week
Information Technology
3-6 Years
fulltime
Not Specified
Not Specified
We are looking for an ML Operations Engineer. Requirements:
Strong experience writing production-level code in Python.
Experience with AWS: EC2, EKS, Lambda, S3, SageMaker, ECS.
Experience working with containerized applications (Docker / Kubernetes)
Hands-on experience with MLOps tools such as Kubeflow and MLflow.
Strong experience with architecture as code: Terraform, CloudFormation.
Ability to set priorities, maintain focus, take ownership, and drive a task to conclusion without supervision.
Demonstrable experience in working on collaborative software projects and knowledge of clean software architecture principles
Experience writing technical documentation to industry standards.
Baseline knowledge of engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.
Ability and willingness to discover, evaluate, use, and learn new technologies.
Experience working within an agile framework using Scrum methods.
Continue to improve the data pipeline
Collaborate with the data science team as required.
Participate in planning efforts, estimation, and peer code-reviews, ensuring high-quality code standards and test-covered development
150,000
40 hours per week
Information Technology
3-5 Years
fulltime
Bachelors
23-36 Years
business development, cold calling, customer dealing, english fluency, international sales
JOB DESCRIPTION
Zepto Systems
We are looking for a Business Development Executive (International Sales).
The role requires significant interaction with clients, over the telephone or by email. Tasks are varied and include:
Understanding customers' diverse, specific business needs and applying product knowledge to meet those needs
Ensuring quality of service by developing a thorough and detailed knowledge of technical specifications and other features of employers' systems and processes and then documenting them
Cold-calling in order to create interest in products and services, generate new business leads and arrange meetings
Identifying and developing new business through networking and courtesy and follow-up calls
Preparing and delivering customer presentations and demonstrations of the software, articulately and confidently
Maintaining awareness and keeping abreast of constantly changing software and hardware systems and peripherals
Developing effective sales plans using sales methodology
Meeting sales targets set by managers and contributing to team targets
Networking with existing customers in order to maintain links and promote additional products and upgrades
Responding to tender documents, writing proposals.
Managing workload in order to organize and prioritize daily and weekly goals
Contributing to team or progress meetings to update and inform colleagues.
40,000
40 hours per week
Information Technology
2-5 Years
fulltime
Bachelors
20-34 Years