Risk Analysis Specialist II - Hadoop Platform
Charlotte, North Carolina; Atlanta, Georgia; Chicago, Illinois
Bank of America Merrill Lynch has an opportunity for a Risk Analysis Specialist within our Global Risk Analytics (GRA) function. GRA is a sub-line of business within Global Risk Management (GRM). GRA is responsible for developing a consistent and coherent set of models and analytical tools for effective risk and capital measurement, management and reporting across Bank of America. GRA partners with the Lines of Business and Enterprise functions to ensure that its models and analytics address both internal and regulatory requirements, such as quarterly Enterprise Stress Testing (EST), the annual Comprehensive Capital Analysis and Review (CCAR), and the Current Expected Credit Losses (CECL) accounting standard. GRA models follow an iterative and ongoing development life cycle, as the bank responds to the changing nature of portfolios, economic conditions and emerging risks. In addition to model development, GRA conducts model implementation, data management, model execution and analysis, forecast administration, and model performance monitoring. GRA drives innovation, process improvement and automation across all of these activities.
Overview of the Role:
This role covers the management and governance of storage for GRA's Hadoop storage solution. It requires technical competency primarily in Python and Hadoop, as well as project management skills and the ability to engage senior stakeholders.
Key Responsibilities:
The Hadoop cluster is owned by Technology; however, GRA has a large, strongly technology-skilled Quantitative Services team that builds technology solutions and works with Technology to deliver them to production.
General Data Management - Hadoop Platform
Large amounts of data are migrated onto the platform, and this needs to be monitored carefully to ensure space is not wasted (e.g., converted SAS files are deleted).
Complete internal data moves: legacy shared space needs to be migrated to individual AIFs.
Common data needs to be analysed and moved out of the legacy area to a more controlled location.
The legacy area is then to be decommissioned.
Aged data (all Hadoop lanes): a six-month effort is required to enhance the process so the platform can be managed more efficiently.
Support and analysis are required to manage the large number of databases and multiple file types on the platform, covering all environments. Currently the lower lanes are not being utilised, which may require moving pure development effort off the main platform.
Identify and catalogue non-platform data types; monitor databases for duplication; productionise the current aged-data reporting; implement a consistent file structure on the platform.
Manage aged data to the appropriate level.
Small files management on the Hadoop platform: there is a need to clean up and reduce the large number (57 million) of small files under 128 MB on the platform. These files need to be analysed to understand how they were generated; many will come from the Hive/Impala partitioning process.
Analysis will be required to understand which processes generate the files.
Technology's scripts to reduce or delete the small files will need to be applied, with careful attention paid to their execution.
Productionise the current small-files reporting process.
Manage the number of small files down to the optimal file-size level for the platform.
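As an illustration of the kind of analysis this responsibility involves (a hypothetical sketch, not part of the role description), the snippet below parses the output of `hdfs dfs -ls -R` and counts files under the 128 MB HDFS block-size threshold per top-level directory, which is a common first step in a small-files clean-up:

```python
# Hypothetical sketch: group small files (< 128 MB) by top-level HDFS directory,
# given the text output of `hdfs dfs -ls -R /`. Directory names and sample
# paths below are illustrative, not taken from the actual platform.
from collections import Counter

SMALL_FILE_THRESHOLD = 128 * 1024 * 1024  # default HDFS block size, 128 MB

def small_files_by_area(ls_lines):
    """Count files under the threshold, grouped by top-level directory.

    Each line follows the `hdfs dfs -ls -R` format:
    permissions, replication, owner, group, size, date, time, path.
    """
    counts = Counter()
    for line in ls_lines:
        parts = line.split()
        if len(parts) < 8 or parts[0].startswith("d"):
            continue  # skip directory entries and malformed lines
        size, path = int(parts[4]), parts[7]
        if size < SMALL_FILE_THRESHOLD:
            area = "/" + path.strip("/").split("/")[0]
            counts[area] += 1
    return counts

sample = [
    "drwxr-xr-x   - gra hdfs          0 2023-01-01 10:00 /data/legacy",
    "-rw-r--r--   3 gra hdfs       4096 2023-01-01 10:01 /data/legacy/part-00001",
    "-rw-r--r--   3 gra hdfs  268435456 2023-01-01 10:02 /data/legacy/big.parquet",
]
print(small_files_by_area(sample))  # Counter({'/data': 1})
```

A report like this makes it possible to see which areas (and hence which generating processes, such as over-partitioned Hive tables) contribute most of the small files before any deletion or compaction scripts are run.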
Required Education, Skills, and Experience:
Comfortable speaking to and influencing stakeholders at all levels, up to Director/MD
Must have project management / business analysis skills and discipline
Ability to communicate with / influence Technology
Good understanding of Python; experience with pandas, NumPy, and PySpark, plus the ability to manipulate large datasets and performance-tune
Good understanding of RDBMS database design, performance tuning, and T-SQL (exposure to SQLite, Vertica, or Hadoop)
Basic Unix, VBA, and Excel knowledge
Good understanding of Hadoop infrastructure (Parquet, Hive, etc.)
Experience with complex data architecture
2+ years of experience in either Technology or model development
Strong Programming skills e.g. Python, SQL, Unix or other languages
Strong analytical and problem-solving skills
Strong communication skills and ability to effectively communicate quantitative topics to technical and non-technical audiences
Effectively presents findings, data, and conclusions to influence senior leaders
Ability to work in a large, complex organization, and influence various stakeholders and partners
Strong team player able to seamlessly transition between contributing individually and collaborating on team projects
Ability to extract, analyze, and merge data from disparate systems, and perform deep analysis
Experience with data analytics tools (e.g., Alteryx, Tableau)
Sees the broader picture and is able to identify new methods for doing things
Enterprise Role Overview:
Responsible for performing complex analysis and engaging in the development of modeling that maximizes profits and asset growth and minimizes credit and operating losses and other risk exposures. Provides analytical support on various product strategies to ensure company goals are met. Coordinates the production of performance reports for senior management. Reviews and analyzes trends in current population distributions and recommends strategies. May participate in or develop complex program models to extract data and use databases to provide statistical and financial modeling. Master's degree or large-data experience preferred; programming experience (e.g., SQL, SAS) and/or MicroStrategy experience preferred. Minimum of 5 years' experience.
Shift: 1st shift (United States of America)
Manages People: No