Staff Software Engineer - HAWQ
Headquartered in Palo Alto, Pivotal offers a modern approach to technology that organizations need in order to thrive in a new era of business innovation. Our solutions intersect cloud, big data, and agile development, creating a framework that increases data leverage, accelerates application delivery and decreases costs, while providing enterprises with the speed and scale they need to compete.
Come join a driven, creative, smart and fun-loving company. At Pivotal, you can tackle the most challenging problems, unleash amazing opportunities and build technologies that have a real impact on businesses, people and the world. Every employee has a voice and the autonomy to make decisions and we work together to drive toward tough but rewarding achievements.
We are looking for people who enjoy working in a team, who are dedicated to the whole team winning, and who see communication and collaboration as important skills to learn and develop. We want pioneers who will always be proud of the technologies and the world-class company they helped create at Pivotal, and who had fun doing it!
Engineering | Beijing, China
Pivotal is looking for a few great engineers to join our Apache HAWQ/Pivotal HDB engineering team. Apache HAWQ (currently in incubation) is a distributed Hadoop native SQL engine designed for big data analytics. Pivotal HDB (HAWQ) is built on Apache HAWQ and is deployed throughout the world by demanding customers serving mission-critical applications in financial, telecommunications, retail, and transportation industries.
- You have a passion for large distributed systems that manage data on a massive scale.
- You love building highly concurrent systems that are fault tolerant and extremely reliable.
- You follow current trends in topics such as stream processing and in-memory computing.
- You'd really like to believe there's a way to defy the CAP theorem and get consistency, availability, and partition tolerance all in the same system (even if no one else has managed to do that yet). Above all, you love shipping software as a member of a collaborative team.
At Pivotal, our mission is to enable customers to build a new class of applications, leveraging big and fast data, and do all of this with the power of cloud-independence. Pivotal’s offering includes the Big Data Suite, a complete approach to enterprise data lakes and advanced analytics; Pivotal Cloud Foundry, the industry-leading Platform as a Service product; and world leading ultra-agile application development through Pivotal Labs. Open source is an important part of our strategy. Many of our products are already open source; those that are not will be soon.
The Big Data Suite includes HAWQ, our SQL on Hadoop solution; Greenplum Database (GPDB), our massively parallel data warehouse; GemFire, our distributed in-memory key-value store; and MADlib, our machine learning solution.
The Apache HAWQ engineering team at Pivotal tackles challenges that come with massively parallel distributed systems operating at extreme scale. We delve into areas like query optimization, parallel query execution, scalable distributed data structures and fault-tolerance paradigms. Here at Pivotal, you'll be working on hard problems with a collaborative team, accelerating your growth as an engineer.
Apache HAWQ engineering is located in our light-filled spacious office in Beijing, China.
To make sure you start your day energized, we provide a catered breakfast every weekday morning. Our collaborative, open-plan office space is filled with talented, like-minded engineers who enjoy taking advantage of our Tech Talks and hanging out with their co-workers.
Engineers who make strong contributions will have the opportunity to become Apache committers.
Desired Skills and Experience
- BS/MS/Ph.D. in Computer Science or equivalent, with coursework or experience in distributed systems.
- Strong C/C++ skills, particularly in concurrent programming techniques.
- Keen understanding of state-of-the-art techniques and trends in data management, high-scale network applications, and distributed algorithms.
- Excellent communication and collaboration skills.
This role is based in Beijing, China.