Big Data Principal Architect
Location: North America (Remote/Work from Home Opportunity)
Future Opportunity with Pythian!
In 2017, Pythian continues to grow and build its next generation of experts. Although this position is not available today, we forecast that it will be active and in “ready to hire” status within the next 3-6 months.
In preparation for this demand, we welcome applicants to apply now and, if selected, to continue through the interview process.
Pythian is a global IT services company that specializes in designing, implementing, and managing systems that directly contribute to revenue and business success. We help companies adopt disruptive technologies to advance innovation and increase agility. Our highly skilled technical teams work as an integrated extension of our clients’ organizations to deliver continuous transformation and uninterrupted operational excellence.
Why Commute? Why Relocate? Apply for this remote opportunity!
- Flexible environment: Work remotely from your home!
- Outstanding people: Collaborate with the industry’s top minds.
- Generous vacation: Start with a minimum 3 weeks’ vacation. New baby? Take an extra 2 weeks.
- Substantial training allowance: Hone your skills or learn new ones; experiment and explore using our in-house sandbox; participate in professional development days.
- Fun, fun, fun: Blog during work hours; join our monthly cheese tastings with our resident cheese sommelier in Ottawa; take a day off and volunteer for your favorite charity.
A Principal Architect is a recognized thought leader in Big Data technologies and the business metrics they drive, for enterprises and startups alike. They lead strategic planning of an organization’s Big Data platform maturity, with a focus on fulfilling business requirements around cost, scalability, and flexibility of the platform. They draft technology roadmaps that document gaps against best practices and the precise steps to close them. An architect works with project managers and consultants to ensure that each new iteration of functionality conforms to best practices, demonstrates easily understood business value, and has design impacts that are fully understood across all dimensions of our operation. Finally, they implement the details of the backlog they helped build, in a timely manner and consistent with best practices and customer satisfaction.
- Be a thought leader in Big Data architecture space
- Lead pre-sales and new project estimation process
- Mentor internal and external teams while delivering against an agile backlog
- Model standards of excellence for internal and external written communication
- Develop application data architecture models to further enable effective service-oriented delivery
- Research, evaluate and formally recommend third party software and technology packages
- Audit existing architectures, document best practices and recommendations
- Provide component-level or site-wide performance optimization and capacity planning
- Work with client development teams directly to help engineer highly available, more manageable Big Data platforms
- Travel periodically to client sites for face-to-face meetings and presentations
- Manage and meet expectations for deliverable completion timelines
- Experience architecting Big Data platforms on Apache Hadoop distributions such as Cloudera, Hortonworks, and MapR
- Strong knowledge of cloud data processing architectures
- UNIX/Linux hands on experience
- Demonstrated knowledge of data warehouse concepts
- Strong understanding of distributed systems architecture
- Fluency in at least two object-oriented languages, preferably Java and Python, plus familiarity with functional languages
- Proficiency in SQL; experience with Hive and Impala
- Experience with several major distributed processing frameworks: MapReduce, Spark, Spark Streaming, Storm, Flink
- Experience with Kafka
- Proven ability to work with software engineering teams and understand complex development systems and patterns
- Experience with BI platforms, reporting tools, data visualization products, ETL engines
- Experience with real-time Hadoop query engines such as Dremel, Cloudera Impala, Facebook Presto, or Berkeley Spark/Shark
- DevOps experience
- Experience with HBase
- Demonstrated understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc.)
- This is a full-time position
- Ability to work from company office when appropriate or from home office
- Occasional travel in the US/Canada
- Ability to perform primary job functions while sitting for extended periods of time
- Dexterity of hands and fingers (or skill with adaptive devices) to operate a computer keyboard, mouse, and other computing equipment
- The incumbent must be able to spend long hours in intense concentration
LIMITATIONS AND DISCLAIMER
The above job description is meant to describe the general nature and level of work being performed; it is not intended to and should not be construed as an exhaustive list of all responsibilities, duties and skills required for the position. Employees will be required to follow any other job-related instructions and to perform other job-related duties requested by their manager and in compliance with local and federal laws.
Requirements are representative of minimum levels of knowledge, skills and abilities. To perform this job successfully, the employee must possess the abilities and aptitudes to perform each duty proficiently. Continued employment remains on an “at-will” basis.