Staff Data Engineer, Analytics

ASAPP · Bangalore

At ASAPP, we are on a mission to build transformative machine learning-powered products that push the boundaries of artificial intelligence and customer experience. We focus on solving complex, data-rich problems — the kind where there are huge systemic inefficiencies and where a real solution will have a significant economic impact. Our CX performance platform uses machine learning across both voice and digital engagement channels to augment and automate human work, radically increasing productivity and improving the efficiency and effectiveness of customer experience teams.

The Data Engineering & Analytics team powers the core of our data and analytics products. ASAPP's products are based on natural language processing and serve tens of millions of end-users in real time. We need sophisticated metrics to monitor and continuously improve our systems.

We are seeking a Staff Data Engineer to serve as both a technical leader and a core contributor, designing and building analytic data feeds for our business partners and internal stakeholders. In this role, you will:
    • Lead and deliver projects, working closely with engineers, business partners, product managers, and other stakeholders
    • Engage with business and product teams to understand data requirements and translate those requirements into technical solutions
    • Develop data models, data flows, and integration patterns for reporting applications
    • Contribute to a data lake that facilitates analysis across hundreds of system events
    • Review code for style and correctness across the entire team
    • Write production-grade Redshift & Athena SQL queries
    • Manage and maintain Airflow ETL jobs
    • Test query logic against sample scenarios
    • Work across teams to gather requirements and understand reporting needs
    • Investigate metric discrepancies and data anomalies
    • Debug and optimize queries for other business units
    • Review schema changes across various engineering teams
    • Maintain high-quality documentation for our metrics and data feeds
    • Contribute to our data infrastructure platform, tooling, and automation
    • Participate in our on-call rotation to keep production pipelines up and running
    • Identify inefficiencies in both technical and business processes, and propose solutions

We are looking for someone with:
    • 12+ years of experience in software engineering
    • 8+ years of industry experience as a data engineer, with demonstrated expertise in building data warehouses and data lakes
    • Extensive hands-on experience in designing dimensional data models, data profiling, ETL/ELT processes, and BI development
    • Experience working with large-scale cloud data warehouse solutions such as Amazon Redshift, Google BigQuery, or Snowflake
    • Expertise in SQL, with hands-on experience in at least one dialect such as MySQL, PostgreSQL, or Oracle
    • Proficiency in a high-level programming language such as Python, Java, Scala, or Go
    • Technical knowledge of data exchange and serialization formats such as Protobuf, YAML, JSON, and XML
    • Familiarity with reporting tools such as Sisense, Power BI, Looker, or Tableau is a plus
    • Experience with workflow orchestration and data transformation tools such as Apache Airflow or dbt
    • Ability to work independently with minimal supervision, even when requirements are ambiguous
    • M.S. in computer science, data science, software engineering, information technology, applied mathematics, or statistics
    • Previous experience in a data engineer, data architect, or data analyst role
    • Ability to embrace and quickly ramp up on new technologies and programming languages
    • Experience with temporal data warehouses, anomaly detection, or complex event processing

What we offer:
    • Competitive compensation
    • Stock options
    • Health insurance coverage through ICICI Lombard General Insurance Ltd
    • Onsite lunch & dinner stipend
    • Connectivity (mobile phone & internet) stipend
    • Wellness perks
    • Mac equipment
    • Learning & development stipend
    • Parental leave, including 6 weeks paternity leave
ASAPP is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, disability, age, or veteran status. If you have a disability and need assistance with our employment application process, please email us at careers@asapp.com.

Apply