Endless Possibilities

Trusted by Leaders
I needed a cost-effective transaction monitoring tool that would identify high-risk transactions, flag potential control weaknesses, improve over time through machine learning, reduce the number of false positives reviewed by the compliance team, and be user-friendly in terms of configuration and visualization. konaAI delivers on all counts, and I was very pleased with the choice we made.
Amay Sharma
CEO, Unblast
Driving Success Through Strategic Partnerships
Current Openings
Role Overview:
We are looking for a UI/UX Designer to craft intuitive and engaging user experiences for our AI-powered application. You will be responsible for designing user-friendly interfaces that simplify complex AI functionalities, ensuring seamless interaction for users.
Key Responsibilities:
- Research user needs and AI usability trends to create wireframes, prototypes, and high-fidelity designs.
- Design intuitive interfaces that enhance user interaction with AI-driven features.
- Work closely with AI engineers and developers to ensure seamless design implementation.
- Optimize the user experience based on data-driven insights and user feedback.
- Maintain design consistency with brand guidelines and accessibility standards.
Requirements:
- Proven experience in UI/UX design, preferably in AI or data-driven applications.
- Proficiency in Figma, Adobe XD, Sketch, and design systems.
- Understanding of AI concepts, data visualization, and interaction design.
- Basic knowledge of HTML, CSS, and JavaScript is a plus.
- Knowledge of AI/ML/Gen AI is desirable.
- Strong problem-solving and collaboration skills.
- BE/B.Tech/MCA with 4+ years of relevant experience.
- Responsive UI/front-end design and development using technologies like ReactJS, TypeScript, AngularJS, JavaScript, jQuery, HTML5, CSS3, and Webpack.
- Deep knowledge of ReactJS practices and commonly used modules.
- Integration with external web services, cross-origin requests, and token/session management.
- Expertise in consuming REST APIs (synchronous and asynchronous).
- Expertise in caching and cache-management techniques.
- Experience in Agile SDLC.
- Certification: Any relevant certification in Python, ReactJS, Java, cloud platforms (AWS, Azure, GCP, or TKGI), or big data technologies will be an added advantage.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to understand business requirements and translate them into scalable, efficient, and reliable solutions.
- Knowledge of AI/ML/Gen AI is desirable.
- Location: Hyderabad
- 3+ years of experience in iOS and/or Android mobile development.
- Strong proficiency in Swift/Objective-C (for iOS) and Kotlin/Java (for Android), or experience with cross-platform frameworks like Flutter or React Native.
- Experience with mobile UI frameworks (Jetpack Compose, SwiftUI, or UIKit).
- Familiarity with MVVM, MVP, or Clean Architecture.
- Experience integrating with third-party APIs and SDKs.
- Strong understanding of version control systems (Git, GitHub, GitLab, or Bitbucket).
- Experience with Firebase, Push Notifications, and Mobile Security Best Practices.
- Knowledge of Agile methodologies (Scrum, Kanban).
- Knowledge of AI/ML/Gen AI is desirable
- BE/B.Tech/MCA with 5+ years of relevant experience
- Experience automating infrastructure deployment (Infrastructure as Code, IaC)
- Experience setting up DevOps technology stack components
- Experience integrating test automation suites into the CI/CD pipeline
- Technology stack experience: Jenkins, Terraform, AWS CloudFormation, Kubernetes (K8s), AWS CodeBuild, AWS CodeDeploy, Chef, Puppet, Harness.
- Experience in automating dynamic environment creation.
- Certification: Any relevant certification in Python, ReactJS, Java, cloud platforms (AWS, Azure, GCP, or TKGI), or big data technologies will be an added advantage.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to understand business requirements and translate them into scalable, efficient, and reliable solutions.
- Knowledge of AIOps and AI/ML/Gen AI is desirable.
- Location: Hyderabad
- BE/B.Tech/M.Tech/MS/MCA with 12+ years of relevant experience
- Experience in Cloud Transformation (AWS/AZURE/GCP)
- Experience in Application Modernization & Digital Transformation
- Microservices architecture patterns (event-driven, Saga, messaging)
- Experience with messaging frameworks: Kafka, AMQ, RabbitMQ
- Experience in Spring Boot/Quarkus/Golang
- Experience in setting up the DevOps tech stack and defining CI/CD pipelines
- Any cloud certification is an added advantage
- Programming languages: Java/J2EE, Python, Go
- Containerization experience with Docker and Kubernetes (K8s).
- Experience in database model design and sizing.
- Create HLA, HLD, and LLD documents, capacity sizing, performance engineering, and chaos engineering.
- Hands-on experience in AI/ML is desirable
- Location: Hyderabad
- BE/B.Tech/MCA with 5-10+ years of relevant experience
- Experience in any of the programming languages:
- Java
- Experience in microservices development with Spring Boot/Quarkus
- Experience in database table/schema design and test automation
- Experience using Copilot as a pair programmer will be an added advantage
- Public cloud experience (AWS/GCP/AZURE)
- Expertise in caching and cache-management techniques.
- Experience in Agile SDLC.
- Certification: Any relevant certification in cloud platforms (AWS, Azure, GCP, or TKGI) or big data technologies will be an added advantage.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to understand business requirements and translate them into scalable, efficient, and reliable solutions.
- Knowledge of AI/ML/Gen AI is desirable.
- Location: Hyderabad
- Experience: 3-15 years of experience as a Data Engineer or ETL Developer in complex, large-scale data environments.
- SSIS, Databricks Expertise: Strong hands-on experience with SSIS and Databricks, including using Apache Spark for data processing and optimization.
- ETL Tools: Proficient with various ETL tools and frameworks such as Informatica, Talend, SSIS, or Databricks.
- Big Data Technologies: In-depth knowledge of big data processing frameworks like Spark, Flink, Hadoop, Kafka, etc.
- NoSQL/MongoDB: Expertise in working with NoSQL databases, especially MongoDB, for large-scale data storage, retrieval, and optimization.
- Programming Skills: Proficient in SQL, Python or Java, and PowerShell for building data pipelines.
- SQL and NoSQL Proficiency: Strong knowledge of SQL and experience working with both relational databases and NoSQL databases like MongoDB.
- Data Modeling: Expertise in designing and implementing data models, including OLAP, OLTP, dimensional, and document-based (NoSQL) models.
- Data Reporting: Experience integrating with reporting platforms like Tableau, Qlik, etc.
- Data Warehousing: A data warehouse stores data from multiple sources in an organized manner and underpins a repeatable ETL workflow that supports many different data sources. Experience building data warehouses and data lakes in Snowflake.
- Redesign and refactor legacy custom ETL processes into reusable ETL workflows that can ingest diverse data sources and normalize data to standard JSON/XML output.
- Data Governance: Knowledge of data governance, security standards, and best practices for managing sensitive data.
- Version Control: Experience with Git or other version control systems for code management.
- Certification: Databricks certification, MongoDB certification, Snowflake SnowPro certification, or other relevant certifications in data engineering, cloud platforms (AWS, Azure or GCP or TKGI), or big data technologies.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to translate business requirements into scalable, efficient, and reliable ETL solutions.
- Knowledge of AI/ML/Gen AI is desirable
- Location: Hyderabad
- BE/B.Tech/MCA with 5-10+ years of relevant experience
- Experience in any of the programming languages:
- Python
- Experience in database table/schema design and test automation
- Experience using Copilot as a pair programmer will be an added advantage
- Public cloud experience (AWS/GCP/AZURE)
- Expertise in caching and cache-management techniques.
- Experience in Agile SDLC.
- Certification: Any relevant certification in cloud platforms (AWS, Azure, GCP, or TKGI) or big data technologies will be an added advantage.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to understand business requirements and translate them into scalable, efficient, and reliable solutions.
- Knowledge of AI/ML/Gen AI is desirable.
- Location: Hyderabad
## Position Overview
We are seeking a skilled Natural Language Processing (NLP) Engineer to join our team in developing and implementing cutting-edge language models and machine learning solutions. The ideal candidate will have strong expertise in Python programming, deep learning frameworks, and large-scale data processing.
## Key Responsibilities
- Design, develop, and deploy production-ready NLP solutions using state-of-the-art techniques and architectures
- Build and optimize machine learning pipelines for training, evaluation, and deployment of language models
- Implement and fine-tune transformer-based models (BERT and variants) for various NLP tasks
- Develop and maintain data processing pipelines using Python, Pandas, and Apache Spark
- Create and optimize SQL queries for large-scale data extraction and analysis
- Conduct experimentation with reinforcement learning approaches for NLP tasks
- Perform feature engineering and data analysis using scientific computing libraries
- Collaborate with cross-functional teams to understand requirements and deliver solutions
- Write clean, maintainable, and well-documented code following best practices
- Monitor and optimize model performance in production environments
## Required Qualifications
- Master's or Ph.D. in Computer Science, Computational Linguistics, or related field
- 5+ years of experience in NLP/ML engineering roles
- Strong programming skills in Python and proficiency with NLP/ML libraries:
- scikit-learn for traditional ML algorithms
- PyTorch or TensorFlow for deep learning
- Transformers library (Hugging Face) for working with BERT and other transformer models
- Extensive experience with data manipulation using Pandas and SQL
- Proven track record of implementing and deploying production ML models
- Strong understanding of transformer architectures and attention mechanisms
- Experience with distributed computing frameworks like Apache Spark
- Solid background in machine learning fundamentals and statistical analysis
## Preferred Qualifications
- Experience with reinforcement learning applications in NLP
- Knowledge of cloud platforms (AWS, GCP, or Azure) for ML deployment
- Familiarity with MLOps tools and practices
- Experience with model optimization and quantization
- Publications in NLP conferences or journals
- Open-source contributions to NLP/ML projects
- Experience with multilingual NLP applications
## Technical Skills
- **Programming Languages**: Python (advanced), SQL
- **ML/DL Frameworks**: PyTorch/TensorFlow, scikit-learn
- **NLP Libraries**: Transformers (Hugging Face), spaCy, NLTK
- **Data Processing**: Pandas, NumPy, Apache Spark
- **Model Types**: BERT, RoBERTa, T5, GPT variants
- **Version Control**: Git
- **Data Storage**: SQL databases, data warehouses
- **Development Tools**: Jupyter, VS Code
- BE/B.Tech/M.Tech/MS/MCA with 12+ years of relevant experience
- Experience in Cloud Transformation (AWS/AZURE/GCP) projects
- Experience in Application Modernization & Digital Transformation projects
- Experience handling projects involving microservices architecture patterns (event-driven, Saga, messaging) and messaging frameworks such as Kafka, AMQ, and RabbitMQ
- Experience in Spring Boot/Quarkus/Golang projects
- Understanding of the DevOps tech stack and defining CI/CD pipelines
- Programming language projects: Java/J2EE, Python, Go
- Project management expertise in containerization with Docker and Kubernetes (K8s).
- Experience in database model design and sizing.
- Create HLA, HLD, and LLD documents, capacity sizing, performance engineering, and chaos engineering.
- Certification: Any relevant certification in PMP, cloud platforms (AWS, Azure, GCP, or TKGI), or big data technologies will be an added advantage
- Knowledge of AI/ML/Gen AI is desirable
Responsibilities:
- Design, develop, and maintain automated test scripts using tools like Selenium, Cypress, Playwright, or similar.
- Develop and maintain test automation frameworks for functional, regression, and performance testing.
- Work closely with developers, product managers, and DevOps teams to define test requirements and strategies.
- Perform API testing using tools like Postman, RestAssured, or Karate.
- Identify, document, and track bugs and issues using JIRA, TestRail, or similar test management tools.
- Conduct performance and load testing using JMeter, Gatling, or similar tools.
- Implement CI/CD pipelines for automated testing in Jenkins, GitHub Actions, or Azure DevOps.
- Ensure software quality through unit, integration, and end-to-end testing strategies.
- Continuously improve test automation strategies and best practices.
Required Skills & Qualifications:
- 4+ years of experience in QA automation.
- Proficiency in programming languages like Java, Python, JavaScript, or C#.
- Hands-on experience with test automation frameworks (Selenium, Cypress, Playwright, TestNG, JUnit, etc.).
- Strong knowledge of API testing (Postman, RestAssured, Karate).
- Experience with CI/CD tools (Jenkins, GitHub Actions, Azure DevOps).
- Understanding of version control systems like Git.
- Experience in working with cloud platforms (AWS, Azure, GCP) is a plus.
- Knowledge of Agile methodologies (Scrum, Kanban).
- Strong problem-solving and analytical skills.
- Knowledge of AI/ML/Gen AI is desirable
Domain Architect
Roles & Responsibilities:
Presales & Solution architecting for BFSI (India)
- Interacting with the client's business teams to identify business needs, translating those needs into requirements, and working with the practice teams to ensure the proposed solution addresses the client's business needs while keeping the winnability of the proposition in mind.
- Demonstrated capability in leveraging AI technologies to develop Agentic AI solutions in the Banking and Financial sector.
- Knowledge in Retail Banking, Corporate/commercial banking, Lending, Investments, Pensions, Insurance and personalization domains.
- Strong understanding of ESB/API/microservices architecture deployment in financial services. Strong grasp of hardware sizing and effort estimation for the various components involved.
- Design the complete solution for the client's requirements and take ownership of all aspects of solutioning: architecture (functional and technical), scoping, and sizing.
- Complete ownership of the quality of the proposal and work with the bid team to ensure submission on time.
- Take complete ownership of presenting the solution to the client addressing their business requirement and clearly articulating the solution proposed
- Ownership of PoCs/Demos made to clients
Must Have:
- At least 10-15 years of experience working for a bank, a financial services company, or an IT company in the areas of Agentic AI, GenAI, ESB/API/microservices, and at least one of the public cloud platforms.
- 5+ years of presales experience in Banking and Financial services domain.
- Presentation skills
- Experience in working with multiple teams
- Strong appreciation of technology as an enabler of business decision making
- Appreciation of the effort estimation process to be able to validate the estimates made by delivery teams
- Knowledge of various software solutions in the above areas
Roles & Responsibilities:
Presales & Solution architecting for HLS (India)
- Interacting with the client's business teams to identify business needs, translating those needs into requirements, and working with the practice teams to ensure the proposed solution addresses the client's business needs while keeping the winnability of the proposition in mind.
- Demonstrated capability in leveraging AI technologies to develop Agentic AI solutions in the Healthcare & Lifesciences sector.
- Strong understanding of ESB/API/microservices architecture deployment in Healthcare & Lifesciences services. Strong grasp of hardware sizing and effort estimation for the various components involved.
- Knowledge of enabling digitized healthcare services such as electronic health/medical records, patient data and diagnosis records, and clinical trial management.
- Design the complete solution for the client's requirements and take ownership of all aspects of solutioning: architecture (functional and technical), scoping, and sizing.
- Complete ownership of the quality of the proposal and work with the bid team to ensure submission on time.
- Take complete ownership of presenting the solution to the client addressing their business requirement and clearly articulating the solution proposed
- Ownership of PoCs/Demos made to clients
Must Have:
- At least 10-15 years of experience working for a bank, a financial services company, or an IT company in the areas of Agentic AI, GenAI, ESB/API/microservices, and at least one of the public cloud platforms.
- 2+ years of presales experience in Healthcare & Lifesciences services domain.
- Strong interpersonal skills and Presentation skills
- Experience in working with multiple teams
- Strong appreciation of technology as an enabler of business decision making
- Appreciation of the effort estimation process to be able to validate the estimates made by delivery teams
- Knowledge of various software solutions in the above areas
- Bachelor’s or master’s degree
- 12+ years of experience as a Domain Architect and/or SME in the ERM space across enterprises
- The candidate should have extensive hands-on experience in the areas below:
- Cybersecurity
- Financial Risks
- Operational Risks
- Compliance Risk
- Deep domain knowledge
- Regulatory considerations
- Risk Identification and Assessment
- Risk Mitigation Strategy
- Collaborate with all the stakeholders
- Extensive knowledge and experience in understanding ERM requirements in the above areas and preparing requirements for IT systems/applications.
- Provide inputs in the design of the IT systems/applications.
- Architect and design IT solutions.
- Act as a bridge between Technical teams and Business teams in ERM space.
- Excellent Communication skills
- Certifications in any of the below will be an added advantage
- Cloud (AWS, GCP, AZURE)
- Bachelor's or master's degree in computer science or a related field
- Experience in SAP ERP architecture, implementations, and upgrades
- Knowledge of SAP Security, IT/OT standards, and ERP infrastructure
- Good understanding of service support processes and SLA
- Strong analytical and conceptual skills
- Define solution models and architecture standards
- Collaborate with stakeholders, including project managers, developers, and IT experts
- Excellent communication skills in English
- Bachelor’s or master’s degree
- 12+ years of experience as a Domain Architect and/or SME in Salesforce
- The candidate should have extensive hands-on experience in the areas below:
- Responsible for designing and implementing Salesforce solutions aligned to business domains:
- Customer Relationship Management (CRM)
- Marketing
- Sales
- Support
- Develop best practices in designing and implementing solutions
- Define solution models and architecture standards
- Collaborate with stakeholders, including project managers, developers, and IT experts
- Experience implementing solutions on public clouds like AWS, Azure, and GCP.
- Excellent communication skills in English
- Good to have Certifications in Salesforce and/or Cloud (AWS, GCP, AZURE)
- Experience: 15+ years of experience as a Data Architect in complex, large-scale data environments.
- SSIS, Databricks Expertise: Strong hands-on experience with SSIS and Databricks, including using Apache Spark for data processing and optimization.
- ETL Tools: Proficient with various ETL tools and frameworks such as Informatica, Talend, SSIS, or Databricks.
- Big Data Technologies: In-depth knowledge of big data processing frameworks like Spark, Flink, Hadoop, Kafka, etc.
- NoSQL/MongoDB: Expertise in working with NoSQL databases, especially MongoDB, for large-scale data storage, retrieval, and optimization.
- Programming Skills: Proficient in SQL, Python or Java, and PowerShell for building data pipelines.
- SQL and NoSQL Proficiency: Strong knowledge of SQL and experience working with both relational databases and NoSQL databases like MongoDB.
- Data Modeling: Expertise in designing and implementing data models, including OLAP, OLTP, dimensional, and document-based (NoSQL) models.
- Data Reporting: Experience integrating with reporting platforms like Tableau, Qlik, etc.
- Data Warehousing: A data warehouse stores data from multiple sources in an organized manner and underpins a repeatable ETL workflow that supports many different data sources. Experience building data warehouses and data lakes in Snowflake.
- Redesign and refactor legacy custom ETL processes into reusable ETL workflows that can ingest diverse data sources and normalize data to standard JSON/XML output.
- Data Governance: Knowledge of data governance, security standards, and best practices for managing sensitive data.
- Version Control: Experience with Git or other version control systems for code management.
- Certification: Databricks certification, MongoDB certification, Snowflake SnowPro certification, or other relevant certifications in data engineering, cloud platforms (AWS, Azure or GCP or TKGI), or big data technologies.
- Soft Skills: Strong problem-solving skills, excellent communication, and the ability to work in a collaborative team environment.
- Analytical Mindset: Ability to translate business requirements into scalable, efficient, and reliable ETL solutions.
- Knowledge of AI/ML/Gen AI is desirable