Lead Data Architect
Interested in working for a nonprofit where the mission is to rid the world of an incapacitating disease?
Looking for an opportunity to collaborate with a global team and create a data platform that meets the organization’s ambitious strategic initiatives?
Currently we are engaged in an executive search for a large, national nonprofit that has strategically realigned its IT organization and created a number of high-impact roles, including this one: Lead Data Architect.
The Lead Data Architect is a strategic role responsible for shaping and leading technology demand for the organization’s healthcare access, education, and navigation needs. The objective of this role is to ensure the nonprofit maximizes value by identifying, prioritizing, and optimizing the technology projects that offer the greatest return, and responding accordingly. You will play a critical role in delivering technology solutions that serve more than 800 staff members in 40+ offices across the United States, thousands of volunteers, and tens of thousands of people living with a challenging disease.
The Lead Data Architect will understand the business domain and develop solutions in collaboration with other team members inside the department to achieve meaningful results from strategic initiatives. They will proactively design, implement, and assess procedures that anticipate changes in the strategic landscape, and triage requests for organizational improvements, technological advancements, and evolving business needs. This role will be responsible for driving results through the research, development, and integration of data platforms.
- Provide architectural leadership and technical vision for our Enterprise Data Platform by designing and architecting components across the data platform spectrum, from data collection to advanced analytics and data visualization
- Ensure new features and subject areas are modeled to integrate with existing structures and provide a consistent view; develop and maintain documentation of the data architecture, data flows, and data models of the data platform, appropriate for various audiences
- Provide direction on the adoption of cloud technologies and industry best practices in data platform architecture and modeling, providing technical leadership to large enterprise-scale initiatives
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business enablement, maximizes reuse, and protects the ecosystem
- Architect solutions for Performance, Availability, Reliability, Security & Cost
- Develop a comprehensive data architecture capable of supporting various data types (structured, semi-structured, unstructured) and analytics needs from reporting to machine learning.
- Provide a standard common business vocabulary, express strategic requirements, outline high-level integrated design to meet those requirements, and align with enterprise strategy and related business architecture
- Collaborate with business, product, and technology teams early in the product lifecycle to influence end-to-end architecture, including functional and non-functional aspects, with a keen eye for data quality, data integrity, and data availability
- Create strategies for data transformations and analysis for clinical, image, device, wearable, and demographic data
- Evaluate and implement a variety of data tools (Python, SQL, NoSQL, Talend, Snowflake/Synapse) on Azure to build ETLs/ELTs and data models
- Drive proof-of-concept initiatives and rapid prototyping with the intent of validating hypotheses
- Research and evaluate the best-of-breed technologies to inform data architecture decisions, build-vs-buy, and cost/benefit analysis
- Drive platform automation to promote continuous integration/continuous delivery, test-driven development, and streamlined production deployment frameworks
- Lead collaborative reviews of design, code, data, and feature implementations to drive engineering excellence around total cost of ownership, data quality, and process maintainability
- Collaborate to design, implement, & assess solutions & procedures to be compliant with data governance policies and standards
Required Experience and Skills
- Bachelor's Degree in Computer Science, Engineering, or equivalent work experience; advanced degree a plus
- 10+ years’ experience with large-scale, distributed data pipelines, as well as data management, modeling, and storage (structured and unstructured stores)
- Demonstrated experience architecting and implementing a data warehouse, preferably using Snowflake or Synapse
- Experience migrating data from an RDBMS to a cloud data warehouse, preferably Snowflake or Synapse
- Proven understanding of data governance, data quality, reusable frameworks, and decision support systems design principles
- Deep understanding of modern data engineering practices including scalability, distributed computing, Azure/AWS/GCP cloud infrastructure, containerization, etc.
- Proven ability to inspire confidence, create executive presentations and guide strategic discussions with senior management.
- Ability to understand and adapt to changing business priorities and technology advancements
- Ability to operate in a multifaceted and fast-paced environment, building strategy while executing tactics, including hands-on contributions
- Proven track record of collaboration with technology leaders and business partners to drive impact
- In-depth, hands-on experience using structured and unstructured data as well as key data technologies, including Azure, Snowflake, Talend, Airflow, Kafka, Spark, etc.
- Background in all aspects of software engineering with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture
- Proficiency in tools/languages such as Python, Java, Scala, SQL (T-SQL, PL/SQL, Spark SQL), Data Factory, and Talend Data Integration/MDM/Catalog
- Particular expertise in performance optimization of analytics workloads: persistence (HDFS/object storage), storage formats (Parquet/ORC), compression, partitioning, query-layer optimization, etc.
- In-depth knowledge of databases (SQL, NoSQL), data modeling, modern ETL processes (batch, lambda, delta) and distributed computing frameworks
- Experience working in a scrum/agile environment, associated tools (Jira), and code management/versioning (e.g., git)
- Excellent written and verbal communication skills
- Ability to prioritize multiple tasks in a fast-paced environment
Professional License/Certification Required:
Project Management Professional (PMP) Certification
Microsoft Azure Data Engineering (DP-203)
Microsoft Azure Solutions Architect Expert
SnowPro Advanced: Architect Certification (a plus)
As a bonus, it would be great to have:
- Hands-on experience in DevOps, automating and deploying packages between different environments, and an understanding of container technologies (a huge plus)
- Strong problem-solving skills matched with the ability to articulate solutions to other team members