Responsibilities:
- Work on the next-generation UDW data platform, built on a distributed SQL engine on top of the Hadoop ecosystem
- Improve the daily ETL loading process, making it more automated and error-proof to handle the system's increasing daily data volume
- Design the new programming interface of our Data Warehouse, which will be used to connect it to upstream and downstream applications
- Work with globally distributed team members and counterparts, understand users' requirements, provide production support, and implement solutions for our projects
- Oversee data quality issues in the daily risk reports, carry out the data reconciliation process, explain any differences, and update or adjust the data as needed
Requirements (mandatory and preferred):
Mandatory:
- Bachelor's degree in Computer Science
- 4+ years of experience with the Hadoop ecosystem (Spark, Drill, Kylin, and other distributed SQL engines)
- Experience with core and advanced Java programming
- Strong Unix shell scripting skills
- Able to communicate verbally in English
- Experience coordinating work on a global scale
- Familiarity with testing and development of server-side applications; experience using Spring
- Diligent about unit testing and quality assurance
- Able to quickly research and understand new concepts and technologies, and apply them to new development
- Excellent communication skills for working with application users and team members
Preferred:
- Experience with Spark/Kafka
- Experience with GemFire
- Experience with databases, including UDB, Oracle, and SQL Server
- Experience with Microsoft Analysis Services and OLAP technology
- Understanding of financial products, specifically the risks associated with fixed income and equities products