At WeWork, data sits at the center of our business, providing insight into the effectiveness of our physical and digital products and features. We believe data brings everything together, and it is central to how we make decisions.
WeWork Central Data & Analytics is a team constantly striving to create an amazing experience for our customers and internal teams. We place a high value on culture and trust, and believe you will have a positive influence in your own way.
If you're passionate about building scalable data & analytics models and architecture, and you are motivated to make an impact by creating robust, scalable data models used by every team, come join us. You will help shape the vision and architecture of WeWork's next generation of central data models, making it easy for analysts, data scientists, and other data consumers to build data-driven products, features, and insights. You'll be responsible for developing a reliable data & analytics architecture that scales with the company's incredible growth. You will be part of an experienced engineering team and work with passionate leaders on challenging data problems.
- Lead the team to deliver large-scale projects; set the roadmap and drive its execution through resource planning and allocation
- Architect and design large-scale data analytics infrastructure in production (performance, reliability, monitoring, self-service)
- Serve data models as a product to the entire organization, including design, implementation, and debugging
- Think through the long-term impacts of key design decisions and handle failure scenarios
- Build self-service platforms to power WeWork's cross-functional teams and drive the whole organization to be data-driven
- 7+ years of experience in data engineering
- 3+ years of hiring, managing, and mentoring teams of data engineers
- Experience working closely with Analytics/Data Science teams
- Excellent communication skills, empathy, initiative and ownership
- Strong background in data warehousing and software engineering concepts and design
- Proficient in SQL
- Proficient in at least one programming language (Python, Bash, Java, etc.)
- Experience with one or more of the following technologies:
  - Database and data warehouse solutions: Redshift, Snowflake, Postgres
  - Workflow management: Airflow, Oozie, Azkaban
  - Cloud storage: S3, GCS
  - Reporting and business intelligence solutions: Looker, Tableau, etc.
  - CRM solutions: Salesforce
  - Distributed logging systems: Kafka, Pulsar, Kinesis, etc.
  - Batch processing: Spark, Hadoop, etc.
  - IDLs: Avro, Protobuf, or Thrift
- Eager to learn new things and passionate about technology
We are an equal opportunity employer and value diversity in our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.