WeWork is the platform for creators, providing hundreds of thousands of members across the globe with space, community, and services that enable them to do what they love and craft their life's work. Our mission is to build a world where people work to make a life, not just a living, and our own team members are central to that goal.

WeWork manages hundreds of buildings and serves hundreds of thousands of members around the world. Workspace intelligence is critical for building a community that enhances productivity, encourages innovation, and strengthens collaboration.

Workspace Intelligence relies on data — how to gather it, how to analyze it, and what to do with the knowledge derived from it. We invest in areas such as pervasive computing, data management, machine learning, and computational social science to understand the environment, the people in it, and how they interact with one another. We also analyze data from the digital world: we uncover social dynamics from digital communication, including emails, messages, and social networks.


The Spatial & People Analytics Engineering Team is looking for a Data Engineer to build out a data ingestion framework. The engineer will build functionality to ingest data from a variety of protocols and cloud storage tools, including Kafka, MQTT, RabbitMQ, S3, and relational databases. The biggest challenge of the system is scalability at the size of WeWork, where billions of events will be ingested throughout the day for analytical use.
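To give a flavor of the work, the paragraph above could be sketched as a protocol-agnostic source abstraction behind a batching ingest loop. This is a minimal illustration, not WeWork's actual design; the `Source`, `InMemorySource`, and `ingest` names are hypothetical, and a real deployment would plug in Kafka, MQTT, RabbitMQ, or S3 readers behind the same interface.

```python
from abc import ABC, abstractmethod
from typing import Iterator, List


class Source(ABC):
    """Protocol-agnostic event source. Concrete subclasses would wrap
    Kafka, MQTT, RabbitMQ, S3, or a relational database (hypothetical)."""

    @abstractmethod
    def poll(self, max_events: int) -> List[dict]:
        """Return up to max_events pending events."""


class InMemorySource(Source):
    """Stand-in source backed by a plain list, for illustration only."""

    def __init__(self, events: List[dict]):
        self._events = list(events)

    def poll(self, max_events: int) -> List[dict]:
        batch, self._events = self._events[:max_events], self._events[max_events:]
        return batch


def ingest(source: Source, batch_size: int = 2) -> Iterator[List[dict]]:
    """Drain the source in fixed-size batches, as a high-volume
    ingester might before handing batches to downstream storage."""
    while True:
        batch = source.poll(batch_size)
        if not batch:
            break
        yield batch


source = InMemorySource([{"id": i} for i in range(5)])
batches = list(ingest(source))
print([len(b) for b in batches])  # → [2, 2, 1]
```

Batching is what makes this pattern scale: downstream writes amortize over many events, and each protocol-specific reader only has to implement `poll`.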


  1. Experience programming in one or more of the following languages: Java, Python, C++
  2. Knowledge of real-time protocols, including but not limited to Confluent/Kafka, MQTT, RabbitMQ, gRPC
  3. Hands-on experience with data serialization and table formats, such as Apache Avro, Parquet, or Iceberg
  4. Knowledge of cloud products, such as Amazon Web Services (AWS) S3, IoT, or DynamoDB
  5. Knowledge of remote procedure call (RPC) frameworks for enterprise usage
  6. Knowledge of REST API design principles
  7. Knowledge of distributed system design principles and microservices architecture
  8. A strong communicator: the ideal candidate will work cross-functionally with data source engineering teams, as well as Product Management and Compliance

