Pythian

Big Data Principal Architect (Telecommute)

DK-Copenhagen
1 week ago
Requirement Number
2017-0068
Other Locations
Remote/Work from home option; GR-Athens; PT-
Job Type
Full Time
Career Level
Expert, Senior
Time Zone
EU
Job Category
Big Data

Job Description

Big Data Principal Architect 

Location: UK/Europe (Remote/Work from Home Opportunity) 

 

Why Pythian?

Pythian is a global IT services company that specializes in designing, implementing, and managing systems that directly contribute to revenue and business success. We help companies adopt disruptive technologies to advance innovation and increase agility. Our highly skilled technical teams work as an integrated extension of our clients’ organizations to deliver continuous transformation and uninterrupted operational excellence.

Why Commute? Why Relocate? Apply for this remote opportunity!

 

Pythian Perks

  • Flexible environment: Work remotely from your home!
  • Outstanding people: Collaborate with the industry’s top minds.
  • Generous vacation: Start with a minimum of 3 weeks’ vacation. New baby? Take an extra 2 weeks.
  • Substantial training allowance: Hone your skills or learn new ones; experiment and explore using our in-house sandbox; participate in professional development days.
  • Fun, fun, fun: Blog during work hours; join our monthly cheese tastings with our resident cheese sommelier in Ottawa; take a day off and volunteer for your favorite charity.

OVERVIEW:

A Principal Architect is a recognized thought leader in Big Data technologies and related business metrics for enterprises and startups alike. They conduct strategic planning of an organization’s Big Data platform maturity, focusing on fulfilling business requirements around the cost, scalability, and flexibility of the platform. They draft technology roadmaps that document best-practice gaps and the precise steps to close them. An architect works with project managers and consultants to ensure that each new iteration of functionality conforms to best practices, demonstrates easily understood business value, and has design impacts that are fully understood across all dimensions of our operation. Finally, they implement the details of the backlog they helped build, in a timely manner, consistent with best practices and customer satisfaction.

 

RESPONSIBILITIES:

  • Be a thought leader in the Big Data architecture space
  • Lead the pre-sales and new-project estimation process
  • Mentor internal and external teams while delivering against an agile backlog
  • Model standards of excellence for internal and external written communication
  • Develop application data architecture models to further enable effective service-oriented delivery
  • Research, evaluate, and formally recommend third-party software and technology packages
  • Audit existing architectures; document best practices and recommendations
  • Provide component- or site-wide performance optimization and capacity planning
  • Work directly with client development teams to help engineer highly available, more manageable Big Data platforms
  • Travel periodically to client sites for face-to-face meetings and presentations
  • Manage and meet expectations for deliverable completion timelines

 

QUALIFICATIONS:

  • Experience architecting Big Data platforms using Apache Hadoop and the Cloudera, Hortonworks, and MapR distributions
  • Strong knowledge of cloud data processing architectures
  • Hands-on UNIX/Linux experience
  • Demonstrated knowledge of data warehouse concepts
  • Strong understanding of distributed systems architecture
  • Fluency in at least two object-oriented languages, preferably Java and Python, and familiarity with functional languages as well
  • Proficiency in SQL; experience with Hive and Impala
  • Experience with several major distributed processing frameworks: MapReduce, Spark, Spark Streaming, Storm, Flink
  • Experience with Kafka
  • Proven ability to work with software engineering teams and understand complex development systems and patterns

 

Bonus/’Nice-to-Haves’:

  • Experience with BI platforms, reporting tools, data visualization products, and ETL engines
  • Experience with real-time Hadoop query engines such as Dremel, Cloudera Impala, Facebook Presto, or Berkeley Spark/Shark
  • DevOps experience
  • Experience with HBase
  • Demonstrated understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc.)

 

WORKING CONDITIONS

  • This is a full-time position
  • Ability to work from a company office when appropriate, or from a home office
  • Occasional travel in the US/Canada
  • Ability to perform primary job functions while sitting for extended periods of time
  • Dexterity of hands and fingers (or skill with adaptive devices) to operate a computer keyboard, mouse, and other computing equipment
  • The incumbent must be able to spend long hours in intense concentration

 

LIMITATIONS AND DISCLAIMER

The above job description is meant to describe the general nature and level of work being performed; it is not intended to and should not be construed as an exhaustive list of all responsibilities, duties and skills required for the position.  Employees will be required to follow any other job-related instructions and to perform other job-related duties requested by their manager and in compliance with local and federal laws.

Requirements are representative of minimum levels of knowledge, skills and abilities.  To perform this job successfully, the employee must possess the abilities and aptitudes to perform each duty proficiently. Continued employment remains on an “at-will” basis.

 
