As a Business Solutions Architect (Big Data Application Architect) at WhiteHawk, you will enable data-driven decision making for operating, maintaining, and continuously improving the WhiteHawk online environment. The opportunities for providing insights are endless; to name a few: data structure and operations analysis, logistics and process optimization, and innovations with machine learning.
You will design, develop, and maintain the information and insights platform that enables data-driven decision making across the organization, supporting both external AWS customers and internal teams. Working with business customers and development teams, you will define analytics requirements and then deliver flexible, scalable, end-to-end solutions. You will apply big data and emerging technologies while driving business intelligence solutions end to end: business requirements, data modeling, ETL, metadata, reporting, and dashboarding.
Duties and Responsibilities:
- Work closely with the business and development teams to make sure data and reporting requirements are met
- Design, implement, and support platforms that can provide automated reporting and ad-hoc access to large datasets
- Learn and understand a broad range of WhiteHawk’s data resources and know how, when, and which to use
- Understand user pain points and provide guidance to the team to develop and build solutions
You are a versatile Business Solutions Architect with experience developing reporting and analysis applications in a large-scale, dynamic environment. You have expertise in the design, creation, management, and business use of large datasets, along with excellent technical and business communication skills; you can work with others to understand data requirements, build ETL to ingest data into the data warehouse, and deliver end-user-facing reporting applications. Above all, you should be passionate about working with huge datasets to answer business questions and drive growth.
Basic Qualifications:
- A Bachelor’s degree in Computer Science, Information Systems, Mathematics, Statistics, Finance, Business, or a related field, or equivalent work experience
- 3+ years of experience with data modeling, SQL, ETL, and data warehousing, or as a Data Engineer at a technology company
- Expertise in writing SQL scripts and in languages such as Python, Perl, or Ruby, as well as Java and MapReduce frameworks such as Hive/Hadoop
- Experience with enterprise-class Business Intelligence tools such as MicroStrategy, Tableau, Pentaho, etc.
- Knowledge of AWS products and services and Linux environments
- Excellent verbal/written communication & data presentation skills, including ability to succinctly summarize key findings and effectively communicate with both business and technical teams
Preferred Qualifications:
- Exposure to NoSQL databases (such as DynamoDB, MongoDB)
- Exposure to predictive/advanced analytics and tools (such as R, SAS, MATLAB)
- Exposure to MPP systems and reactive or functional programming
If this sounds like you, please send your cover letter and resume to firstname.lastname@example.org.
We hope you will join us!