Big Data Engineer
Federal Hill Consulting has partnered with a financial firm to build a team of Big Data Analysts. This role carries responsibility for protecting the integrity of US securities capital markets. We maintain a portfolio of ‘surveillance patterns’ that look for manipulative and non-compliant behavior in the database of all stock market transactions, which itself consists of tens of billions of records per day. Analysts work with these large volumes of data using state-of-the-art, industry-standard technologies, all operated entirely in a cloud computing environment.
To keep improving its effectiveness, the team maintains a strong culture of performance and innovation. Collaboration within the team and with business stakeholders is frictionless, and members are given significant independence to experiment with better ways of applying technology to business value. Technological and career growth opportunities are a natural, everyday part of the working environment.
Please read further to get a better feel for the role and the skills that lead to success in this team.
Responsibilities:
- Construct queries to gain an understanding of data sets, discover data anomalies, and look for ways to leverage data in support of our regulatory mission.
- Serve as the subject matter expert for specific surveillance products.
- Analyze impacts of enhancements, issues, or changes to data or surveillance products.
- Interact with stakeholders to understand data cases, document and review requirements, and elicit feedback.
- Build strong relationships with stakeholders and develop an understanding of stakeholder needs for advocacy and planning.
- Use statistical and other advanced techniques to analyze and interrogate data for business insights.
- Assimilate and present results and conclusions to a variety of business and technical stakeholders.
- Collaborate with the delivery team throughout the SDLC process.
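The first responsibility above, querying data sets to surface anomalies, can be sketched with a toy example in Python. The schema, the z-score rule, and the threshold below are all invented for illustration; real surveillance patterns are far more sophisticated.

```python
# Hypothetical illustration: flag trades whose volume deviates sharply
# from a symbol's average. The (symbol, volume) schema and the z-score
# threshold are invented for this sketch.
from statistics import mean, stdev

def flag_volume_anomalies(trades, z_threshold=3.0):
    """trades: list of (symbol, volume) tuples; returns anomalous tuples."""
    by_symbol = {}
    for symbol, volume in trades:
        by_symbol.setdefault(symbol, []).append(volume)

    anomalies = []
    for symbol, volumes in by_symbol.items():
        if len(volumes) < 2:
            continue  # need at least two observations for a spread
        mu, sigma = mean(volumes), stdev(volumes)
        if sigma == 0:
            continue  # identical volumes: nothing stands out
        for v in volumes:
            if abs(v - mu) / sigma > z_threshold:
                anomalies.append((symbol, v))
    return anomalies
```

In production this kind of scan would run as a distributed query over billions of rows rather than an in-memory loop, but the shape of the logic (group, profile, flag outliers) is the same.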
Skills:
- Experience manipulating and querying large data sets with one or more programming languages such as Scala, Python, or SQL.
- Excellent written and verbal communication skills, including documentation.
- Proficiency with Big Data and Cloud technologies.
- Ability to work independently to solve complex problems involving multiple stakeholders.
- Innate curiosity about data and anomaly detection.
Requirements:
- Required: 5+ years of IT experience
- Required: Bachelor of Science; graduate-level degree in Mathematics or Statistics highly preferred
- Required: Programming in Java, C++, Scala, or Python
- Required: Experience with Hadoop/MapReduce and/or Hive
- Required: SQL development
- Required: Unix/shell scripting
- Required: Designing distributed solutions for parallel processing of large data sets
- Required: Full SDLC experience (requirements analysis, design, development, unit testing, deployment, support)
- Required: Good communication skills
- Preferred: Big Data technologies, cloud computing
- Preferred: Test-driven development
- Preferred: Understanding of NASDAQ, capital markets, and market structure
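As a rough illustration of the Hadoop/MapReduce requirement above: the paradigm boils down to a map phase that emits key/value pairs, a shuffle that groups pairs by key, and a reduce phase that aggregates each group. The toy below mimics those phases in plain Python (the record layout is invented); a real job would run the same logic in parallel across a cluster.

```python
# Toy map/shuffle/reduce pipeline in plain Python, mimicking the phases
# of a Hadoop MapReduce job. Record format is invented for illustration.
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # Emit (key, value) pairs: here, (symbol, volume) per trade record.
    for rec in records:
        yield rec["symbol"], rec["volume"]

def reduce_phase(pairs):
    # Sorting then grouping by key stands in for the shuffle step;
    # each group is reduced by summing its volumes.
    for symbol, group in groupby(sorted(pairs, key=itemgetter(0)),
                                 key=itemgetter(0)):
        yield symbol, sum(v for _, v in group)

records = [
    {"symbol": "AAPL", "volume": 300},
    {"symbol": "MSFT", "volume": 150},
    {"symbol": "AAPL", "volume": 200},
]
print(dict(reduce_phase(map_phase(records))))  # {'AAPL': 500, 'MSFT': 150}
```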
We are an equal opportunity employer and value diversity. All employment is decided on the basis of qualifications, merit, and business need.