NPS Prism is a market-leading, cloud-based CX benchmarking and operational improvement platform owned by Bain & Company. NPS Prism provides its customers with actionable insights and analysis that guide the creation of game-changing customer experiences. Based on rock-solid sampling, research, and analytic methodology, it lets customers see how they compare to their competitors on overall NPS®, and on every step of the customer journey.

With NPS Prism you can see where you’re strong, where you lag, and how customers feel about doing business with you and your competitors, in their own words. The result: Prioritize the customer interactions that matter most. NPS Prism customers use our customer experience benchmarks and insights to propel their growth and outpace the competition.

Launched in 2019, NPS Prism has rapidly grown to a team of over 200, serving dozens of clients around the world. NPS Prism is 100% owned by Bain & Company, one of the top management consulting firms in the world and a company consistently recognized as one of the world’s best places to work. We believe that diversity, inclusion, and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities, and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally.


In this pivotal role on the NPS Prism Data Architecture & Engineering team, you will work under the guidance of the team’s Senior Engineering Manager across a broad range of data-related responsibilities.

The primary tasks of this role revolve around core Data Platform & Products development: working as part of a cross-functional Agile development team to understand the business context and product domain; designing the underlying architecture and solutions to meet the product’s functional and non-functional requirements; and ensuring scalability, performance, and best practices within the expected volumes and usage patterns. The role also designs and implements solutions with Data Quality, Data Governance, and Integrity as key elements, leveraging industry best practices and patterns, with automation, reusability, and standardization as pillars of the Platform. This Senior Engineer will work with the team to test and validate solution designs, updating and iterating as needed to support new and changing business requirements and product features.


Product Development, Deployment, Support, and Maintenance

  • Own and support the end-to-end Data Architecture & Engineering operations within the Data Platform Squad, from process design and data modeling to implementation and operational support
  • Work with NPS Prism’s Engineering team, Leads, and Managers to validate designs, discuss trade-offs and benefits of various approaches, and ensure long-term scalability and performance as data volumes and user concurrency grow with product adoption
  • Serve as a key liaison between Engineering and Business areas, translating needs and requirements from the Product Team into Engineering initiatives, and bringing non-technical users up to speed by creating ready-to-use Data Assets and providing upskilling
  • Assess and propose the optimal Data Architecture and Data Modeling approach, balancing Product requirements and use cases with the features and assets of the Data Platform
  • Ensure data models follow all NPS Prism and industry-standard best practices related to selected modeling approach, data security, normalization, naming conventions, key relationships, indexing, constraints, and other considerations
  • Validate and test solution designs and implementations to ensure they meet all product and business requirements, guaranteeing Data Quality, Governance, Integrity, and Security
  • Collaborate with other multifunctional teams (Operations, Data Visualization, Analytics, Software) on defining the optimal data structures for BI tools to leverage the data, which could include custom views, materialized views, user defined functions/procedures, or optimized SQL
  • Work with internal Infrastructure teams to ensure best practices and standards are followed related to infrastructure/hosting, data security, user access management, permissions, monitoring, patching, and logging/auditing
  • Design, implement, and performance-tune data pipelines that ingest data from a variety of sources (cloud storage, flat files, APIs, etc.), run cleansing/validation processes, and output the data in a defined format for product use
  • Work as part of a cross-functional Bain team to successfully deploy NPS Prism to new clients and on-board new data sets/instruments
  • Use industry-leading data/ETL tools for data preparation, including data validation, cleansing, joins/merges, and reformatting into product-required schemas for import
  • Follow industry best practices for DataOps: development collaboration, versioning, quality checking, automated testing, etc. (CI/CD, Git)
  • Evangelize an Agile mindset, culture, and practices in a pragmatic way to improve team collaboration, transparency, and operational effectiveness
  • Promote a Lean and DevOps/DataOps culture, minimizing waste and rework
  • Champion Data Governance as a key strategic pillar of the Data Architecture & Engineering area to guarantee quality, transparency, audit-readiness, documentation, and core processes throughout our end-to-end data journey
  • Promote Data Democratization within NPS Prism, creating and facilitating training sessions for diverse audiences (technical and business)

Other Responsibilities

  • Keep up to date on various technologies related to data architecture and engineering
  • Participate in technical discovery, POCs, and innovation work streams to validate new tools, technologies, and designs
  • Training, professional development, internal meetings, team building, etc.



Qualifications

  • Bachelor of Science degree in Engineering, Computer Science, Applied Mathematics, or another technology-related field preferred, or strong relevant work experience
  • 5-7 years of relevant work experience with a data-engineering focus in a fast-paced, complex business setting; exposure to both large traditional/multinational companies and startups is ideal
  • Relevant experience with Data Architecture & Engineering paradigms and processes, from ideation to deployment
  • Experience designing, implementing, and supporting end-to-end automated data pipelines using enterprise tooling, cloud-native services, or custom coding/development
  • Business and data translation skills: transforming end-user needs for insights into new Data Assets, leveraging best practices and solution design
  • Experience leading small teams of 2-4 people within a squad or area, with responsibility for technical guidance, project deliverables, and team coordination
  • Expertise in both conceiving and implementing best-in-class approaches, and in planning the scale-up of pilots, MVPs, and ad hoc solutions into more robust processes and products
  • Experience with modern-stack data preparation and analytics tools, especially Databricks; knowledge of Snowflake, dbt, and Airflow
  • Advanced knowledge of Cloud Platforms and Engineering, especially within Azure
  • Proficient Python coding and development experience
  • Experience in developing end-to-end Data processes with Quality, Governance, Standardization & Scalability as pillars
  • Strong knowledge of traditional relational data modeling, database management systems, and data warehousing (e.g., SQL Server, MySQL, PostgreSQL, Oracle), plus advanced SQL skills (advanced SQL features, custom user-defined functions/procedures, materialized views, etc.)
  • Familiarity with non-relational database architecture design and paradigms
  • Strong communication and presentation skills, including documenting complex data flows and processes for long-term support and maintenance
  • Must be results-driven, an analytical and creative thinker, self-motivated and proactive, and highly organized, with a demonstrated ability to stay calm and composed in a fast-moving environment
  • Entrepreneurial spirit and innovative mindset: willing to try new things and think outside the box, with a test-and-learn attitude



Nice to Have

  • Experience in MLOps from a Data Platform perspective (architecture, modules, processes, data assets, standards, etc.) is a plus
  • Data Infrastructure Engineering – IaC (Terraform) and Containers (Docker, Kubernetes)
  • Exposure and knowledge of event-driven and streaming/near-real time architecture (Kafka, MQ, Flink)
  • Enterprise BI platforms (e.g., Tableau, Power BI, Qlik, Domo, Looker)
  • Data Visualization & Data Apps libraries and frameworks (Dash Enterprise/Plotly, Streamlit, Highcharts, amCharts, etc.)
  • Various Data orchestration, lineage and catalog tools and processes
  • Enterprise cloud-based data ingestion and transformation technologies (e.g., Talend, Informatica, Alteryx Server)