We value our client-centric consultants: we recognize their expertise, nurture their growth, and invest in long-term career opportunities that reflect their impact and dedication. We also encourage job referrals and offer competitive rewards for successfully referring qualified candidates.
We are currently hiring for the following positions. Please forward your most recent resume to hr@uniteqq.com.
- Developers (Python, R, Java, Scala, Prolog, Julia) – 4+ years of experience and a strong understanding of data structures, algorithms, and design patterns.
- BI Architect – a seasoned Power BI Architect to lead the design, development, and optimization of enterprise-level business intelligence solutions. This role is pivotal in transforming complex data into actionable insights that drive strategic decision-making across the organization.
- Snowflake Architect – an experienced architect to lead the design, implementation, and optimization of cloud-native data platforms on Snowflake. This strategic role blends deep technical expertise with architectural vision to drive scalable, secure, and high-performance data solutions across the organization. Requires 8+ years of experience in data architecture, data engineering, or cloud analytics roles, plus experience in banking and financial services.
- AI/ML Lead Architect – a visionary architect to spearhead the design, development, and deployment of advanced artificial intelligence and machine learning solutions. This role blends deep technical expertise with strategic leadership to drive innovation, scalability, and measurable business impact across industries. Responsibilities include leading the development of GenAI applications (prompt engineering, contextual tuning, and integration with enterprise systems); designing scalable, secure, and efficient ML pipelines using cloud-native tools and MLOps best practices; and evaluating and selecting appropriate AI frameworks, libraries, and platforms (e.g., TensorFlow, PyTorch, Hugging Face, OpenCV).
- Dremio Architect – an architect to lead the design, implementation, and optimization of data lakehouse solutions on the Dremio platform. This role is central to enabling high-performance, scalable, and secure analytics environments across cloud, hybrid, and on-prem infrastructures. Requires deep expertise in Dremio, Apache Iceberg, Apache Arrow, and lakehouse architecture; experience with cloud platforms (AWS, Azure, GCP) and hybrid data environments; and familiarity with BI tools (Power BI, Tableau) and data virtualization strategies.